High-dimensional statistical learning has become an increasingly important research area. In this course, we will provide the theoretical foundations of high-dimensional learning for several widely studied problems with many applications. More specifically, we will review concentration inequalities, VC dimension, metric entropy, and their statistical implications; consider high-dimensional linear modeling and its extensions (e.g., additive modeling and interaction learning); derive minimax theories for parametric and nonparametric function estimation; and study the statistical properties of model selection methods.
Prerequisite: Probability theory, mathematical statistics, linear models
Reference: Martin J. Wainwright (2019) High-Dimensional Statistics: A Non-Asymptotic Viewpoint. Cambridge University Press.
Target Audience: Graduate students
Teaching Language: Chinese
Dr. Yuhong Yang is a Professor at the Yau Mathematical Sciences Center. He received his Ph.D. in statistics from Yale University in 1996. His research interests include model selection, model averaging, multi-armed bandit problems, causal inference, high-dimensional data analysis, and machine learning. He has published in journals across several fields, including the Annals of Statistics, JASA, IEEE Transactions on Information Theory, IEEE Signal Processing Magazine, Journal of Econometrics, Journal of Machine Learning Research, and International Journal of Forecasting. He is a recipient of the US NSF CAREER Award and a fellow of the Institute of Mathematical Statistics. He is included in Stanford University's list of the world's top 2% most cited scientists.