主讲人 Speaker：陈钇帆 Yifan Chen (Courant Institute, New York University)
时间 Time：9:00-11:00 am, Nov. 20, 2023
Sampling a target distribution with an unknown normalization constant is a fundamental problem in computational science and engineering. This task is often addressed by constructing dynamical systems of probability densities, such as MCMC and sequential Monte Carlo. Recently, gradient flows in the space of probability measures have become increasingly popular for generating such dynamical systems. In this talk, we will discuss several fundamental questions about this general methodology for sampling probability distributions. Any instantiation of a gradient flow for sampling requires an energy functional and a metric to determine the flow, as well as numerical approximations of the flow to derive algorithms. We first show that the KL divergence is a special and unique energy functional that simplifies numerical implementation of the flow. We then explain how the Fisher-Rao metric is an exceptional choice that leads to exceptionally fast convergence of the flow. Finally, we discuss numerical approximations based on interacting particles, Gaussians, and mixtures to derive implementable algorithms for sampling.
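As a concrete illustration of the ideas in the abstract (not part of the announcement itself): the gradient flow of the KL divergence under the Wasserstein metric is the Fokker-Planck equation, and its standard interacting-particle discretization is the unadjusted Langevin algorithm. The sketch below, under the assumption of a hypothetical 1D standard Gaussian target, shows how such a particle approximation yields an implementable sampler.

```python
import numpy as np

# Sketch: the Wasserstein gradient flow of KL(rho || pi) is the
# Fokker-Planck equation; its particle discretization is the
# unadjusted Langevin algorithm (ULA):
#   x_{k+1} = x_k + h * grad log pi(x_k) + sqrt(2h) * N(0, 1).
# Target pi is a hypothetical standard Gaussian, so grad log pi(x) = -x.

def grad_log_pi(x):
    return -x  # score function of N(0, 1)

rng = np.random.default_rng(0)
n_particles, n_steps, h = 2000, 2000, 0.01
x = rng.uniform(-5.0, 5.0, size=n_particles)  # initial particle ensemble

for _ in range(n_steps):
    noise = rng.standard_normal(n_particles)
    x = x + h * grad_log_pi(x) + np.sqrt(2.0 * h) * noise

# The empirical distribution of the particles approximates the target.
print(f"sample mean = {x.mean():.3f}, sample std = {x.std():.3f}")
```

Note that ULA with finite step size h has a small discretization bias; the ensemble's mean and standard deviation only approximate the target's 0 and 1.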