The schedule is preliminary and subject to change. Slides will be updated as the term progresses.
Week | Date | Topic | Assignments | Readings and Resources |
---|---|---|---|---|
0 | 1/8 | Lecture: Introduction [slides] | | |
0 | 1/9 | Tutorial (optional): Review of a) probability, linear algebra and calculus and b) useful Python/Unix commands | | |
0 | 1/10 | Lecture: Language Modeling [LM slides] | | |
0 | 1/12 | Tutorial (optional): Building language models | | |
1 | 1/15 | Lecture: LM smoothing and evaluation; Text classification - Naive Bayes [NB slides] | | |
1 | 1/16 | Tutorial (optional): Learning PyTorch | | |
1 | 1/17 | Lecture: Text classification - Logistic Regression [Logistic regression slides] [Evaluation slides] | Due: HW0 | |
1 | 1/19 | Tutorial (optional): Building classifiers with PyTorch | | |
2 | 1/22 | Lecture: NN review and word representations [Neural network and classification review] [WV slides 1] | | |
2 | 1/23 | Tutorial (optional): Building text classifiers with word embeddings | | |
2 | 1/24 | Lecture: Word Representations and neural language models [WV slides 2] | | |
2 | 1/26 | Tutorial (optional): Building text classifiers with pretrained word embeddings | | |
3 | 1/29 | Lecture: Sequence modeling (HMMs) [slides] | | |
3 | 1/30 | Tutorial (optional): Neural language models with BoW/Fixed Window FFN | | |
3 | 1/31 | Lecture: Neural sequence modeling (RNNs) [slides] | Due: HW1 | |
3 | 2/2 | Tutorial (optional): Using RNNs in PyTorch | | |
4 | 2/5 | Lecture: Neural Sequence Modeling (LSTM/GRU) and sequence-to-sequence models [slides] [slides] | | |
4 | 2/7 | Lecture: Attention in sequence-to-sequence models [slides] | | |
5 | 2/12 | Lecture: Transformers and contextualized word embeddings [transformer slides] [cwe slides] | | |
5 | 2/14 | Lecture: Benchmark datasets [project info slides] [benchmark slides] | Due: HW2 | |
7 | 2/19 | No class - Reading break | | |
7 | 2/21 | No class - Reading break | | |
8 | 2/26 | Lecture: Pretraining and fine-tuning [pretraining slides] [fine-tuning slides] | | |
8 | 2/28 | Lecture: Few-shot and in-context learning [slides] | Due: HW3 | |
9 | 3/4 | Lecture: Constituency Parsing [slides] | Due: Project proposal | |
9 | 3/6 | Lecture: Dependency parsing [slides] | | |
10 | 3/11 | Lecture: Parameter-efficient fine-tuning and instruction tuning [fine-tuning slides] [instruction-tuning slides] | | |
10 | 3/13 | Lecture: Final project tips, model debugging and analysis [slides] | Due: HW4 | |
11 | 3/18 | Lecture: NLP applications [slides] | | |
11 | 3/20 | Lecture: Scaling laws for LLMs [slides] | | |
11 | 3/21 | | Due: Project milestone | |
12 | 3/25 | Lecture: Grounding [slides] | | |
12 | 3/27 | Lecture: LLM Agents [slides] | | |
13 | 4/1 | No class - Easter | | |
13 | 4/3 | Guest Lecture by Nick Vincent: Societal Impacts of Generative AI: Economic Factors and More [slides] | | |
14 | 4/8 | Lecture: Final project presentations | | |
14 | 4/10 | Lecture: Conclusion [slides] | | |
14 | 4/11 | | Due: Final project report | |