High-Dimensional Statistical Learning Theory

Teacher: Yuhong Yang
Schedule: Mon. & Tues., 9:50-11:25 am, Sept. 18-Dec. 20, 2023 (excluding the week of Nov. 27)
Venue: Lecture Hall B725, Tsinghua University Shuangqing Complex Building A (清华大学双清综合楼A座B725报告厅)


Sept. 29-Oct. 6: Mid-Autumn Festival and National Day Holidays, no lectures.

The lecture originally scheduled for Oct. 2 will be moved to Oct. 7.



High-dimensional statistical learning has become an increasingly important research area. This course provides theoretical foundations of high-dimensional learning for several widely studied problems with many applications. Specifically, we will review concentration inequalities, VC dimension, metric entropy, and their statistical implications; consider high-dimensional linear modeling and its extensions (e.g., additive modeling and interaction learning); derive minimax theories for parametric and nonparametric function estimation; and study statistical properties of model selection methods.

Prerequisite: Probability theory, mathematical statistics, linear models

Reference: Martin J. Wainwright (2019) High-Dimensional Statistics: A Non-Asymptotic Viewpoint. Cambridge University Press.

Target Audience: Graduate students 

Teaching Language: Chinese 




Dr. Yuhong Yang is a Professor at the Yau Mathematical Sciences Center. He received his Ph.D. in statistics from Yale University in 1996. His research interests include model selection, model averaging, multi-armed bandit problems, causal inference, high-dimensional data analysis, and machine learning. He has published in journals across several fields, including the Annals of Statistics, JASA, IEEE Transactions on Information Theory, IEEE Signal Processing Magazine, the Journal of Econometrics, the Journal of Machine Learning Research, and the International Journal of Forecasting. He is a recipient of the US NSF CAREER Award and a Fellow of the Institute of Mathematical Statistics. He is included in Stanford University's list of the world's top 2% most cited scientists.


Registration: https://www.wjx.top/vm/OtHvcjq.aspx#