The schedule is preliminary and subject to change. Slides will be updated as the term progresses.
| Week | Date | Topic | Assignments | Readings and Resources |
|---|---|---|---|---|
| 1 | 1/5 | Lecture: Introduction (NLP history and basics of language modeling) [slides] | | |
| | 1/6 | Tutorial (optional): Review of a) probability, linear algebra and calculus and b) useful Python/Unix commands | | |
| | 1/7 | Lecture: Language modeling [LM slides] | | |
| | 1/9 | Tutorial (optional): Building language models | | |
| 2 | 1/12 | Lecture: Text classification - Naive Bayes and evaluation [NB slides] [Evaluation slides] | | |
| | 1/13 | Tutorial (optional): Learning PyTorch | | |
| | 1/14 | Lecture: Text classification - Logistic regression [Logistic regression slides] | Due: HW0 | |
| | 1/16 | Tutorial (optional): Building classifiers with PyTorch | | |
| 3 | 1/19 | Lecture: Neural network review and word representations [WV slides 1] | | |
| | 1/20 | Tutorial (optional): Building text classifiers with word embeddings | | |
| | 1/21 | Lecture: Word representations and neural language models [WV slides 2] | | |
| | 1/23 | Tutorial (optional): Building text classifiers with pretrained word embeddings | | |
| 4 | 1/26 | Lecture: Sequence modeling (HMMs and RNNs) [slides] [slides] | | |
| | 1/27 | Tutorial (optional): Neural language models with BoW/fixed-window FFN | | |
| | 1/28 | Lecture: Neural sequence modeling (LSTM/GRU) [slides] | Due: HW1 | |
| | 1/30 | Tutorial (optional): Using RNNs in PyTorch | | |
| 5 | 2/2 | Lecture: Sequence generation, attention and transformers [sequence generation slides] [transformer slides] | | |
| | 2/4 | Lecture: Contextualized word embeddings (masked language modeling) [cwe slides] | | |
| 6 | 2/9 | Lecture: Pretrained LLMs [slides] | | |
| | 2/11 | Lecture: Tasks, benchmarks and project information [project info slides] [benchmark slides] | Due: HW2 | |
| 7 | 2/16 | No class - Reading break | | |
| | 2/19 | No class - Reading break | | |
| 8 | 2/23 | Lecture: Using LLMs: prompting and parameter-efficient fine-tuning [slides] [fine-tuning slides] | Due: Project proposal | |
| | 2/25 | Lecture: Post-training - instruction tuning [slides] | Due: HW3 | |
| 9 | 3/2 | Lecture: Constituency parsing [slides] | | |
| | 3/4 | Lecture: Dependency parsing [slides] | | |
| 10 | 3/9 | Lecture: Post-training - preference alignment [slides] | | |
| | 3/11 | Lecture: Final project tips, model debugging and analysis [slides] | Due: HW4 | |
| 11 | 3/16 | Lecture: Grounding and multimodal models [slides] | | |
| | 3/18 | Lecture: Information retrieval and RAG | | |
| | 3/19 | | Due: Project milestone | |
| 12 | 3/23 | Lecture: Guest lecture (TBD) | | |
| | 3/25 | Lecture: TBD (in Spring 2025, the TAs gave an overview of recent multimodal research: Visual grounding in 3D (Austin Wang); Text and 3D representation learning (Han-Hung Lee); Text-to-scene generation (Austin Wang); LLMs for modeling DNA (Chuanqi Tang)) | | |
| 13 | 3/30 | Lecture: Scaling laws and modern LLMs [slides] | | |
| | 4/1 | Lecture: LLM agents [slides] | | |
| 14 | 4/6 | No class - Easter Monday | | |
| | 4/8 | Lecture: Final project presentations [slides] | | |
| | 4/9 | | Due: Final project report | |