Instructor: Ning Su
Organizers: Artane Jérémie Siad, Ning Su, David Pechersky, Justin Yeh, Zhen Zhang
Time: Monday, 18:10 - 19:10
Venue: Jing Zhai
This seminar will build a conceptual foundation in machine learning through careful reading of the classical literature. We will cover foundational work in reinforcement learning, large language models, and world models, picking up current developments as they arise naturally along the way.
The format is one paper per week, read closely and discussed carefully. We will also engage with implementation, working through how these systems are actually built.
As a companion activity, we will run regular Sunday hackathons, building real projects to develop hands-on fluency alongside the theoretical work.
We are interested in understanding machine learning from the ground up.
Our first talk will be given by Ning Su, who will take us through "Attention Is All You Need" (Vaswani et al., 2017), the paper that introduced the transformer architecture underlying virtually all modern large language models.
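For those who want a small preview before the first session: the paper's central operation is scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. Below is a minimal NumPy sketch of that computation; the variable names and toy sizes are illustrative, not taken from any reference implementation.

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V  (Vaswani et al., 2017)
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)        # similarity of each query to each key
    scores -= scores.max(axis=-1, keepdims=True)   # numerically stable softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                     # weighted sum of the value vectors

# Toy example: 3 query positions, 4 key/value positions, d_k = 8 (made-up sizes)
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
print(scaled_dot_product_attention(Q, K, V).shape)  # (3, 8)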
