The schedule is preliminary and subject to change. Slides will be updated as the term progresses.
| Week | Date | Topic | Assignments | Readings and Resources |
|---|---|---|---|---|
| 1 | 1/5 | Lecture: Introduction [slides] | | |
| | 1/6 | Tutorial (optional): Review of (a) probability, linear algebra, and calculus and (b) useful Python/Unix commands | | |
| | 1/7 | Lecture: Language Modeling [LM slides] | | |
| | 1/9 | Tutorial (optional): Building language models | | |
| 2 | 1/12 | Lecture: Text classification - Naive Bayes [NB slides] [Evaluation slides] | | |
| | 1/13 | Tutorial (optional): Learning PyTorch | | |
| | 1/14 | Lecture: Text classification - Logistic Regression [Logistic regression slides] | Due: HW0 | |
| | 1/16 | Tutorial (optional): Building classifiers with PyTorch | | |
| 3 | 1/19 | Lecture: NN review and word representations [Neural network and classification review] | | |
| | 1/20 | Tutorial (optional): Building text classifiers with word embeddings | | |
| | 1/21 | Lecture: Word representations and neural language models [WV slides 1] | | |
| | 1/23 | Tutorial (optional): Building text classifiers with pretrained word embeddings | | |
| 4 | 1/26 | Lecture: Word2Vec [WV slides 2] | | |
| | 1/27 | Tutorial (optional): Neural language models with BoW/fixed-window FFN | | |
| | 1/28 | Lecture: Sequence modeling (HMMs and RNNs) [slides] [slides] | Due: HW1 | |
| | 1/30 | Tutorial (optional): Using RNNs in PyTorch | | |
| 5 | 2/2 | Lecture: Neural sequence modeling (LSTM/GRU) [slides] | | |
| | 2/4 | Lecture: Sequence-to-sequence models [slides] | | |
| 6 | 2/9 | Lecture: Transformers and contextualized word embeddings [transformer slides] [cwe slides] | | |
| | 2/11 | Lecture: Contextualized word embeddings and project information [project info slides] [benchmark slides] | Due: HW2 | |
| 7 | 2/16 | No class - Reading break | | |
| | 2/19 | No class - Reading break | | |
| 8 | 2/23 | Lecture: Pretraining and fine-tuning [pretraining slides] [fine-tuning slides] | Due: Project proposal | |
| | 2/25 | Lecture: Few-shot and in-context learning [slides] | Due: HW3 | |
| 9 | 3/2 | Lecture: Constituency parsing [slides] | | |
| | 3/4 | Lecture: Dependency parsing [slides] | | |
| 10 | 3/9 | Lecture: Instruction tuning and reinforcement learning from human feedback [instruction-tuning slides] | | |
| | 3/11 | Lecture: Final project tips, model debugging and analysis [slides] | Due: HW4 | |
| 11 | 3/16 | Lecture: Parameter-efficient fine-tuning [fine-tuning slides] | | |
| | 3/18 | Lecture: Grounding [slides] | | |
| | 3/19 | | Due: Project milestone | |
| 12 | 3/23 | Guest lecture | | |
| | 3/25 | Lecture: Overview of recent multimodal research: Visual grounding in 3D (Austin Wang); Text and 3D representation learning (Han-Hung Lee); Text-to-scene generation (Austin Wang); LLMs for modeling DNA (Chuanqi Tang) | | |
| 13 | 3/30 | Lecture: Scaling laws for LLMs [slides] | | |
| | 4/1 | Lecture: LLM agents [slides] | | |
| 14 | 4/6 | Lecture: Final project presentations | | |
| | 4/8 | Lecture: Conclusion [slides] | | |
| | 4/9 | | Due: Final project report | |