Speaker: Prof. Ronen Eldan (Weizmann Institute of Science)
Time: Fri., 12:00-13:00, Oct. 7, 2022 (Beijing Time)
Venue: Zoom Meeting ID: 271 534 5558, Passcode: YMSC
Date: 2022-10-07
Abstract:
A useful way of producing samples from a given measure $\nu$ on a state space $\Omega$ is to construct a Markov chain whose stationary measure is $\nu$. In order to sample efficiently, one needs to show that the chain "mixes" quickly, in the sense that the measure induced by running $k$ steps of the chain converges rapidly to the stationary measure. In this talk we will present a framework which combines ideas from two seemingly unrelated recent techniques for proving such mixing bounds: (i) the framework of "Spectral Independence", introduced by Anari, Liu and Oveis Gharan and based on high-dimensional expanders, which has given rise to several breakthroughs in the analysis of mixing times of discrete Markov chains, and (ii) the Stochastic Localization technique, which has proven useful in establishing mixing and expansion bounds both for log-concave measures and for measures on the discrete hypercube.

Our framework aims to both unify and extend these techniques, thus providing an approach that gives bounds for sampling algorithms in both discrete and continuous settings. At its center is the concept of a "localization process", which is a martingale taking values in the space of probability measures on $\Omega$. As it turns out, many Markov chains of interest (such as Glauber dynamics) arise naturally in this framework, and this viewpoint provides tools for deriving mixing bounds for the dynamics through the analysis of a corresponding localization process. Generalizations of the concept of Spectral Independence arise naturally from our definitions; in particular, we will show how to recover the main theorems of the spectral independence framework via simple martingale arguments, completely bypassing the theory of high-dimensional expanders. We demonstrate how to apply our machinery to obtain simple proofs of many mixing bounds in the recent literature. We will briefly discuss some applications, among which is the first $O(n \log n)$ bound for the mixing time of the hardcore model (of arbitrary degree) in the tree-uniqueness regime. Based on joint work with Yuansi Chen.
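To fix the idea of the "localization process" named above, here is an illustrative formulation in standard martingale notation (a schematic sketch, not the talk's precise definition): one associates to $\nu$ a sequence of random probability measures $(\nu_t)_{t \ge 0}$ on $\Omega$, adapted to a filtration $(\mathcal{F}_t)$, with

$$\nu_0 = \nu, \qquad \mathbb{E}\big[\nu_{t+1}(A) \,\big|\, \mathcal{F}_t\big] = \nu_t(A) \quad \text{for every measurable } A \subseteq \Omega,$$

so that the process preserves $\nu$ on average, while the individual measures $\nu_t$ typically become progressively more "localized" (in the discrete case, for instance, supported on configurations with more and more coordinates pinned).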
Bio:
Ronen Eldan is a professor at the Weizmann Institute of Science working on probability theory, mathematical analysis, theoretical computer science, and the theory of machine learning. He received the 2018 Erdős Prize and the 2022 Blavatnik Award for Young Scientists, and was a speaker at the 2022 International Congress of Mathematicians. He was also awarded the 2023 New Horizons in Mathematics Prize.
Video: http://archive.ymsc.tsinghua.edu.cn/pacm_lecture?html=Localization_Schemes_A_Framework_for_proving_mixing_bounds_in_Markov_chains.html