The schedule is preliminary and subject to change. Slides will be updated as the term progresses.
| Week | Date | Topic | Assignments | Readings and Resources |
|---|---|---|---|---|
| 1 | 1/5 | Lecture: Introduction (NLP history and basics of language modeling) [slides] | | |
| | 1/6 | Tutorial (optional): Review of a) probability, linear algebra, and calculus and b) useful python/unix commands | | |
| | 1/7 | Lecture: Language Modeling [LM slides] | | |
| | 1/9 | Tutorial (optional): Building language models | | |
| 2 | 1/12 | Lecture: Text classification - Naive Bayes and evaluation [NB slides] [Evaluation slides] | | |
| | 1/13 | Tutorial (optional): Learning pytorch | | |
| | 1/14 | Lecture: Text classification - Logistic Regression [Logistic regression slides] | Due: HW0 | |
| | 1/16 | Tutorial (optional): Building classifiers with pytorch | | |
| 3 | 1/19 | Lecture: NN review and word representations [WV slides 1] | | |
| | 1/20 | Tutorial (optional): Building text classifiers with word embeddings | | |
| | 1/21 | Lecture: Word representations and neural language models [WV slides 2] | | |
| | 1/23 | Tutorial (optional): Building text classifiers with pretrained word embeddings | | |
| 4 | 1/26 | Lecture: Sequence modeling (HMMs and RNNs) [slides] [slides] | | |
| | 1/27 | Tutorial (optional): Neural language models with BoW/fixed-window FFN | | |
| | 1/28 | Lecture: Neural sequence modeling (LSTM/GRU) [slides] | Due: HW1 | |
| | 1/30 | Tutorial (optional): Using RNNs in pytorch | | |
| 5 | 2/2 | Lecture: Sequence generation, attention, and intro to transformers [sequence generation slides] [transformer slides] | | |
| | 2/4 | Lecture: Transformers [transformer slides] | | |
| 6 | 2/9 | Lecture: Pretraining LLMs [pretraining slides] | | |
| | 2/11 | Lecture: Tasks, benchmarks, and project information [project info slides] [benchmark slides] | Due: HW2 | |
| 7 | 2/16 | No class - Reading break | | |
| | 2/19 | No class - Reading break | | |
| 8 | 2/23 | Lecture: Using LLMs - prompting and parameter-efficient fine-tuning [prompting] [fine-tuning] | Due: Project proposal | |
| | 2/25 | Lecture: Modern LLM architecture [slides] | Due: HW3 | |
| 9 | 3/2 | Lecture: Posttraining - instruction tuning and preference alignment [instruction tuning] [preference alignment] | | |
| | 3/4 | Lecture: Posttraining - preference alignment and reasoning [slides] | | |
| 10 | 3/9 | Lecture: Scaling laws and data [slides] | | |
| | 3/11 | Lecture: Final project tips, model debugging and analysis [slides] | Due: HW4 | |
| 11 | 3/16 | Lecture: Parsing [constituency parsing] [dependency parsing] | | |
| | 3/18 | Lecture: Information retrieval and RAG [slides] | | |
| | 3/19 | | Due: Project milestone | |
| 12 | 3/23 | Guest lecture by Issam Laradji: Emerging directions in large language models and AI agents | | |
| | 3/25 | Guest lecture (TBD) | | |
| 13 | 3/30 | Lecture: Grounding and multimodal models, with talks from TAs (TA talks from Spring 2025 are currently linked) [slides]: Visual Grounding in 3D (Austin Wang); Text and 3D Representation Learning (Han-Hung Lee); Text-to-Scene Generation (Austin Wang); LLMs for Modeling DNA (Chuanqi Tang) | | |
| | 4/1 | Lecture: Conclusion [slides] | | |
| 14 | 4/6 | No class - Easter Monday | | |
| | 4/8 | Final project presentations | | |
| | 4/9 | | Due: Final project report | |