Theory and Algorithms in Deep Learning: From A Numerical Analysis Perspective

Speaker: Juncai He
Time: Tues. 9:50-11:25 am & Wed. 13:30-15:05, April 15 - June 4, 2025
Venue: C654, Shuangqing Complex Building A; Zoom Meeting ID: 271 534 5558, Passcode: YMSC

Description:

This course will systematically explore and analyze some key theories and algorithms in deep learning from a numerical analysis perspective. Traditionally, the foundational theory in deep learning is largely concerned with approximation and generalization error estimates for different types of neural networks. We will interpret and study these aspects through a series of more fundamental results, particularly the expressivity of neural networks.


We will begin by presenting and proving a series of results on the connections between classical function classes in numerical analysis and deep neural networks with ReLU and ReLU$^k$ activation functions. Building on these foundational results, we will decompose various error estimates into more fundamental components related to the representational and interpolation capabilities of neural networks.
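One concrete instance of this connection, which the course's foundational results generalize, is that continuous piecewise linear finite element basis functions are exactly representable by shallow ReLU networks. The sketch below (an illustration, not taken from the course materials) writes the one-dimensional "hat" basis function on [0, 1] as a one-hidden-layer ReLU network with three neurons:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hat(x):
    # The piecewise linear "hat" basis function on [0, 1] with node at 1/2,
    # written exactly as a one-hidden-layer ReLU network with 3 neurons:
    #   hat(x) = 2*ReLU(x) - 4*ReLU(x - 1/2) + 2*ReLU(x - 1)
    # It vanishes outside [0, 1] and peaks at hat(1/2) = 1.
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5) + 2.0 * relu(x - 1.0)

xs = np.array([-0.5, 0.0, 0.25, 0.5, 0.75, 1.0, 1.5])
print(hat(xs))  # [0.  0.  0.5 1.  0.5 0.  0. ]
```

Since any linear finite element function is a linear combination of such hats, every one-dimensional linear finite element space sits inside the class of one-hidden-layer ReLU networks; the expressivity results discussed in the course make this correspondence precise in more general settings.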


Regarding models, algorithms, and network architectures in deep learning, we will analyze them through the lens of multilevel algorithms, a well-established framework in numerical partial differential equations, particularly multigrid methods. If time permits, we will also discuss some fundamental ideas, properties, and theoretical aspects of Transformers in large language models.
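To fix ideas about the multilevel framework referenced above, here is a minimal multigrid V-cycle for the one-dimensional Poisson problem (a standard textbook construction, not the course's specific algorithm): smooth on the fine grid, restrict the residual, correct recursively on the coarse grid, prolongate, and smooth again.

```python
import numpy as np

def v_cycle(u, f, nu=2, omega=2.0 / 3.0):
    """One multigrid V-cycle for -u'' = f on (0, 1) with zero Dirichlet
    boundary conditions, discretized by the standard second-order finite
    difference stencil on n = 2^k - 1 interior points."""
    n = len(u)
    h = 1.0 / (n + 1)

    def A(v):
        # apply the tridiagonal [-1, 2, -1] / h^2 operator
        Av = 2.0 * v
        Av[:-1] -= v[1:]
        Av[1:] -= v[:-1]
        return Av / h**2

    if n == 1:
        # coarsest grid: solve the 1x1 system (2 / h^2) u = f exactly
        return np.array([0.5 * h * h * f[0]])

    # pre-smoothing: weighted Jacobi (the diagonal of A is 2 / h^2)
    for _ in range(nu):
        u = u + omega * (0.5 * h * h) * (f - A(u))

    # restrict the residual by full weighting to (n - 1) / 2 coarse points
    r = f - A(u)
    rc = 0.25 * (r[0:-2:2] + 2.0 * r[1:-1:2] + r[2::2])

    # coarse-grid correction: recurse with a zero initial guess
    ec = v_cycle(np.zeros_like(rc), rc, nu, omega)

    # prolongate by linear interpolation and correct
    e = np.zeros(n)
    e[1::2] = ec
    ecp = np.concatenate(([0.0], ec, [0.0]))
    e[0::2] = 0.5 * (ecp[:-1] + ecp[1:])
    u = u + e

    # post-smoothing
    for _ in range(nu):
        u = u + omega * (0.5 * h * h) * (f - A(u))
    return u

# usage: a few V-cycles drive the error down to the discretization level
n = 31
x = np.linspace(0.0, 1.0, n + 2)[1:-1]
f = np.pi**2 * np.sin(np.pi * x)   # exact solution: u(x) = sin(pi x)
u = np.zeros(n)
for _ in range(10):
    u = v_cycle(u, f)
print(np.max(np.abs(u - np.sin(np.pi * x))))
```

The smooth/restrict/correct/prolongate structure mirrors the architectural decompositions (e.g. across network depth and resolution) that the course analyzes through the multilevel lens.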


Prerequisite:

Calculus, Linear Algebra, Basics of Numerical Analysis and Differential Equations


Reference:

1. Anthony, Martin, and Peter L. Bartlett. Neural Network Learning: Theoretical Foundations. Cambridge University Press, 2009.

2. Goodfellow, Ian, Yoshua Bengio, and Aaron Courville. Deep Learning. Cambridge: MIT Press, 2016.

3. Papers by the instructor on this topic


Target Audience: Undergraduate and Graduate Students

Teaching Language: Depends on the audience

Registration: https://www.wjx.top/vm/Y1ZwEVM.aspx#