The schedule is preliminary and subject to change. Slides will be updated as the term progresses.
| Week | Date | Topic | Assignments | Readings and Resources | 
|---|---|---|---|---|
| 1 | 1/6 | Lecture: Introduction [slides] | | |
| | 1/7 | Tutorial (optional): Review of a) probability, linear algebra, and calculus and b) useful Python/Unix commands | | |
| | 1/8 | Lecture: Language Modeling [LM slides] | | |
| | 1/10 | Tutorial (optional): Building language models | | |
| 2 | 1/13 | Lecture: Text classification - Naive Bayes [NB slides] [Evaluation slides] | | |
| | 1/14 | Tutorial (optional): Learning PyTorch | | |
| | 1/15 | Lecture: Text classification - Logistic Regression [Logistic regression slides] | Due: HW0 | |
| | 1/17 | Tutorial (optional): Building classifiers with PyTorch | | |
| 3 | 1/20 | Lecture: NN review and word representations [Neural network and classification review] | | |
| | 1/21 | Tutorial (optional): Building text classifiers with word embeddings | | |
| | 1/22 | Lecture: Word representations and neural language models [WV slides 1] | | |
| | 1/24 | Tutorial (optional): Building text classifiers with pretrained word embeddings | | |
| 4 | 1/27 | Lecture: Word2Vec [WV slides 2] | | |
| | 1/28 | Tutorial (optional): Neural language models with BoW/fixed-window FFN | | |
| | 1/29 | Lecture: Sequence modeling (HMMs and RNNs) [slides] [slides] | Due: HW1 | |
| | 1/31 | Tutorial (optional): Using RNNs in PyTorch | | |
| 5 | 2/3 | Lecture: Neural sequence modeling (LSTM/GRU) [slides] | | |
| | 2/5 | Lecture: Sequence-to-sequence models [slides] | | |
| 6 | 2/10 | Lecture: Transformers and contextualized word embeddings [transformer slides] [cwe slides] | | |
| | 2/12 | Lecture: Contextualized word embeddings and project information [project info slides] [benchmark slides] | Due: HW2 | |
| 7 | 2/17 | No class - Reading break | | |
| | 2/19 | No class - Reading break | | |
| 8 | 2/24 | Lecture: Pretraining and fine-tuning [pretraining slides] [fine-tuning slides] | Due: Project proposal | |
| | 2/26 | Lecture: Few-shot and in-context learning [slides] | Due: HW3 | |
| 9 | 3/3 | Lecture: Constituency parsing [slides] | | |
| | 3/5 | Lecture: Dependency parsing [slides] | | |
| 10 | 3/10 | Lecture: Instruction tuning and reinforcement learning from human feedback [instruction-tuning slides] | | |
| | 3/12 | Lecture: Final project tips, model debugging and analysis [slides] | Due: HW4 | |
| 11 | 3/17 | Lecture: Parameter-efficient fine-tuning [fine-tuning slides] | | |
| | 3/19 | Lecture: Grounding [slides] | | |
| | 3/20 | | Due: Project milestone | |
| 12 | 3/24 | Guest lecture by Nick Vincent: Emerging Concerns with the Generative AI Data Paradigm (and how academic research and “public ai” can help) | | |
| | 3/26 | Guest lecture | | |
| 13 | 3/31 | Lecture: Scaling laws for LLMs [slides] | | |
| | 4/2 | Lecture: LLM agents [slides] | | |
| 14 | 4/7 | Final project presentations | | |
| | 4/9 | Lecture: Conclusion [slides] | | |
| | 4/10 | | Due: Final project report | |