TY - GEN
T1 - Localization Schemes
T2 - 63rd IEEE Annual Symposium on Foundations of Computer Science, FOCS 2022
AU - Chen, Yuansi
AU - Eldan, Ronen
N1 - Publisher Copyright: © 2022 IEEE.
PY - 2022
Y1 - 2022
AB - Two recent and seemingly unrelated techniques for proving mixing bounds for Markov chains are: (i) the framework of Spectral Independence, introduced by Anari, Liu and Oveis Gharan, and its numerous extensions, which have given rise to several breakthroughs in the analysis of mixing times of discrete Markov chains, and (ii) the Stochastic Localization technique, which has proven useful in establishing mixing and expansion bounds both for log-concave measures and for measures on the discrete hypercube. In this paper, we introduce a framework which connects ideas from both techniques. Our framework unifies, simplifies and extends those two techniques. At its center is the concept of a 'localization scheme' which, to every probability measure on some space Ω, assigns a martingale of probability measures which 'localize' in space as time evolves. As it turns out, to every such scheme corresponds a Markov chain, and many chains of interest appear naturally in this framework. This viewpoint provides tools for deriving mixing bounds for the dynamics through the analysis of the corresponding localization process. Generalizations of the concepts of Spectral Independence and Entropic Independence arise naturally from our definitions, and in particular we recover the main theorems of the spectral and entropic independence frameworks via simple martingale arguments (completely bypassing the need to use the theory of high-dimensional expanders). We demonstrate the strength of our proposed machinery by giving short and (arguably) simpler proofs of many mixing bounds in the recent literature. In particular, we: (i) give the first O(n log n) bound for the mixing time of the hardcore model (of arbitrary degree) in the tree-uniqueness regime under Glauber dynamics; (ii) give the first optimal mixing bounds for Ising models in the uniqueness regime under arbitrary external fields; (iii) prove a KL-divergence decay bound for log-concave sampling via the Restricted Gaussian Oracle, which achieves optimal mixing under any exp(n)-warm start; (iv) prove a logarithmic Sobolev inequality for near-critical ferromagnetic Ising models, recovering in a simple way a variant of a recent result by Bauerschmidt and Dagallier.
UR - http://www.scopus.com/inward/record.url?scp=85145876476&partnerID=8YFLogxK
U2 - 10.1109/FOCS54457.2022.00018
DO - 10.1109/FOCS54457.2022.00018
M3 - Conference contribution
T3 - Proceedings - Annual IEEE Symposium on Foundations of Computer Science, FOCS
SP - 110
EP - 122
BT - Proceedings - 2022 IEEE 63rd Annual Symposium on Foundations of Computer Science, FOCS 2022
PB - IEEE Computer Society
Y2 - 31 October 2022 through 3 November 2022
ER -