Teacher: Jinshuo Dong
Schedule: Mon. & Fri., 13:30-15:05, Nov. 18-Dec. 27, 2024 (excluding Nov. 29 & Dec. 2)
Venue: C641, Shuangqing Complex Building A (清华大学双清综合楼A座)
Synopsis:
This crash course will cover concentration inequalities, fundamental tools in data science that are advanced versions of statements like "A standard Gaussian has a <5% chance of deviating more than two standard deviations from its mean." These tools are frequently co-designed with algorithms, which can present a steep learning curve for newcomers. As an attempt to fix this, we’ll start by sketching algorithms with production-grade code examples, such as org.apache.datasketches.countmin.CountMinSketch and sklearn.random_projection.SparseRandomProjection, to illustrate how concentration (in both its scalar and matrix versions) informs design and analysis. Next, we’ll dive into scalar concentration, aiming to unpack the statement: "Lipschitz functions on measures satisfying a log-Sobolev or Poincaré inequality are sub-Gaussian or sub-gamma." For matrix concentration, we’ll cover matrix Bernstein inequalities and compare two approaches: non-commutative Khintchine and the joint convexity of the matrix KL divergence. If time permits, we’ll explore applications in numerical linear algebra, statistical machine learning, or random graph theory.
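To make the first point concrete, here is a minimal Python sketch (not taken from the course materials; the data, the 10,000-dimensional ambient space, and eps = 0.2 are made up for illustration) of how a concentration statement, the Johnson-Lindenstrauss lemma behind sklearn.random_projection, fixes the key design parameter of SparseRandomProjection, namely its output dimension:

import numpy as np
from sklearn.random_projection import SparseRandomProjection, johnson_lindenstrauss_min_dim

# Hypothetical data: 500 points in a 10,000-dimensional space.
rng = np.random.default_rng(0)
n_samples, n_features, eps = 500, 10_000, 0.2
X = rng.standard_normal((n_samples, n_features))

# The JL lemma (a sub-Gaussian concentration statement) tells us how many output
# dimensions suffice to preserve pairwise distances up to a factor of 1 +/- eps,
# independently of the ambient dimension n_features.
k = johnson_lindenstrauss_min_dim(n_samples, eps=eps)
print(f"target dimension from the JL bound: {k}")

# Project and empirically check how well norms are preserved.
X_proj = SparseRandomProjection(n_components=k, dense_output=True,
                                random_state=0).fit_transform(X)
ratios = np.linalg.norm(X_proj, axis=1) / np.linalg.norm(X, axis=1)
print(f"norm ratios lie in [{ratios.min():.3f}, {ratios.max():.3f}]")

For 500 points the bound comes out to roughly 1,400 dimensions, independent of the ambient 10,000, and the printed norm ratios stay close to 1; the analogous error guarantee for CountMinSketch rests on a Markov-inequality argument applied to each row of the sketch.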
The recurring (and perhaps magical) theme throughout will be convexity.
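For orientation, here are standard textbook forms of the two headline statements above (the Gaussian special case of the scalar claim, and a matrix Bernstein inequality in Tropp's form); the course may state and prove different or sharper versions:

\[
\Pr\bigl(f(X) - \mathbb{E} f(X) \ge t\bigr) \;\le\; \exp\!\Bigl(-\frac{t^2}{2L^2}\Bigr),
\qquad f \text{ $L$-Lipschitz},\; X \sim \mathcal{N}(0, I_n),\; t \ge 0,
\]
\[
\Pr\Bigl(\Bigl\|\sum_i X_i\Bigr\| \ge t\Bigr) \;\le\; 2d\,\exp\!\Bigl(-\frac{t^2/2}{v + Rt/3}\Bigr),
\qquad X_i \text{ independent, mean-zero, Hermitian, } d \times d,\; \|X_i\| \le R,\; v = \Bigl\|\sum_i \mathbb{E} X_i^2\Bigr\|.
\]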
The course will be offered on Mondays and Fridays from 1:30 pm to 3:05 pm, beginning November 18. I plan to open each session by wrapping up the previous one with a review and the proof details. We’ll then move on to the next topic, presenting the big picture and key lemmas while leaving the proofs for the following session. This approach gives the audience time to reflect and better appreciate the proofs.
Prerequisite:
Basic probability theory: Markov inequality, central limit theorem
Basic convexity: Jensen's inequality
Knowledge of Fenchel duality and the Poisson limit theorem is helpful but not required. No algorithmic knowledge is assumed.
Registration: https://www.wjx.top/vm/YCOJUSN.aspx#