CS371N: Natural Language Processing (Fall 2023)
NOTE: This page is for an old semester of this class
Instructor: Greg Durrett, gdurrett@cs.utexas.edu
Lecture: Tuesday and Thursday 9:30am - 10:45am, JGB 2.218
Instructor Office Hours: Tuesdays 5pm, Thursdays 3pm, GDC 3.812 and on Zoom (hybrid; see Canvas for link)
TAs: Manya Wadhwa, Jennifer Mickel
TA Office Hours:
- Monday 11am, on Zoom [Manya]
- Monday 5:30pm, hybrid, Desk 2 GDC TA Station (1st floor) [Jenn]
- Wednesday 11am, hybrid, Desk 2 GDC TA Station (1st floor) [Jenn]
- Wednesday 3:30pm, in-person, Desk 1 GDC TA Station (1st floor) [Manya]
See Canvas for a link to the discussion board (Ed Discussion)
Description
This course provides an introduction to modern natural language processing
using machine learning and deep learning approaches. Content includes
linguistics fundamentals (syntax, semantics, distributional properties of
language), machine learning models (classifiers, sequence taggers, deep
learning, language models), key algorithms for inference, and applications to a range of
problems. Students will get hands-on experience building systems to do tasks
including text classification, language modeling, and textual entailment.
Requirements
- CS 429
- Recommended: CS 331, familiarity with probability and linear algebra, programming experience in Python
- Helpful: Exposure to AI and machine learning (e.g., CS 342/343/363)
Course Details
The course lectures will be delivered in a traditional, in-person format. Recordings will be made available
via the LecturesOnline service for students to browse after class. All course materials will be posted
on this website. Note that additional pre-recorded video content overlapping with the concepts in this course
is available on the CS388 website.
The midterm exam will be given in person. Contact the instructor ASAP if this poses a problem for you.
Assignments:
Assignment 0: Warmup (ungraded) [nyt dataset] [tokenizer.py] [solutions]
Assignment 1: Sentiment Classification (due September 7, 11:59pm) [Code and data]
Assignment 2: Feedforward Neural Networks and Optimization (due September 21, 11:59pm) [Code and data]
Assignment 3: Transformer Language Modeling (due October 5, 11:59pm) [Code and data]
Assignment 4: Sequence Modeling and Parsing (due October 17, 11:59pm)
Midterm (topics) [in-class on Thursday, October 19] [past exams: fall 2022 midterm / solutions; fall 2021 midterm / solutions; fall 2020 midterm / solutions (the fall 2021 and fall 2020 exams are longer than this one will be)]
Assignment 5: Factuality of ChatGPT (due November 7, 11:59pm) [Code and data]
Final Project: Dataset Artifacts [instructions] [Code] [instructions for independent projects] (custom proposals due October 27)
Readings: Textbook readings are assigned to complement the material discussed in lecture. You may find it useful to do these readings before lecture as preparation or after lecture as review, but you are not expected to know everything in the textbook unless it is covered in lecture.
Paper readings are intended to supplement the course material if you are interested in diving deeper into particular topics.
Bold readings and videos are most central to the course content; it is recommended that you look at these.
The chief text in this course is Eisenstein: Natural Language Processing,
available as a free PDF online.
(Another generally useful NLP book is Jurafsky and Martin: Speech and Language Processing (3rd ed. draft), with many draft chapters available for free online; however,
we will not be using it much for this course.)
Schedule (subject to change through the first day of classes)
Date | Topics | Readings | Assignments
Aug 22 | Introduction [4pp] | | A0 out (ungraded)
Aug 24 | Classification 1: Features, Perceptron | Classification lecture note; Perceptron Loss (VIDEO); perc_lecture_plot.py; Eisenstein 2.0, 2.1, 2.3.1, 4.1, 4.3 | A1 out
Aug 29 | Classification 2: Logistic Regression | Classification lecture note; Optimization (VIDEO); Jurafsky and Martin 5.0-5.3 |
Aug 31 | Classification 3: Multiclass, Sentiment, Authorship (slides: [1pp] [4pp]) | Multiclass lecture note; Eisenstein 2.4.1, 2.5, 2.6, 4.2; Pang+02; Wang+Manning12; Socher+13 Sentiment; Schwartz+13 Authorship |
Sept 5 | Fairness / Neural 1: Feedforward, Backpropagation [4pp] (handwritten notes: pdf) | Fairness (VIDEO); HutchinsonMitchell18 Fairness; Eisenstein 3.0-3.3; Goldberg 4 |
Sept 7 | Neural 2: Implementation, Word embeddings intro [4pp] | Neural Net Optimization (VIDEO); ffnn_example.py; Eisenstein 3.3; Goldberg 3, 6; Iyyer+15 DANs; Init and backprop | A1 due / A2 out
Sept 12 | Neural 3: Word embeddings [4pp] (notes: [pdf]) | Eisenstein 14.5-14.6; Goldberg 5; Mikolov+13 word2vec; Pennington+14 GloVe |
Sept 14 | Neural 4: Bias, multilingual [4pp] | Bolukbasi+16 Gender; Gonen+19 Debiasing; Ammar+16 Xlingual embeddings; Mikolov+13 Word translation |
Sept 19 | LM 1: N-grams, RNNs | Eisenstein 6.1-6.2 |
Sept 21 | LM 2: Attention, Self-attention | Luong+15 Attention; Vaswani+17 Transformers; Alammar Illustrated Transformer | A2 due / A3 out
Sept 26 | LM 3: Transformers, Implementation [handwritten notes] (slides: [1pp] [4pp]) | Vaswani+17 Transformers; Alammar Illustrated Transformer |
Sept 28 | LM 4: Transformer Language Models / Pre-training 1: Encoders (BERT) [4pp] | Kaplan+20 Scaling Laws; Peters+18 ELMo; Devlin+19 BERT; Alammar Illustrated BERT |
Oct 3 | Pre-training 2: Decoders (GPT), Tokenization, Decoding Methods [4pp] | BostromDurrett20 Tokenizers; Radford+19 GPT2; Brown+20 GPT3; Holtzman+19 Nucleus Sampling |
Oct 5 | Sequence 1: Tagging, POS, HMMs | Eisenstein 7.1-7.4, 8.1 | A3 due / A4 out
Oct 10 | Sequence 2: HMMs, Viterbi | Viterbi lecture note; Eisenstein 7.1-7.4 |
Oct 12 | Trees 1: PCFGs, CKY (slides: [1pp] [4pp]) | Eisenstein 10.1-3, 10.4.1 |
Oct 17 | Trees 2: Dependency / Midterm review [4pp] | Eisenstein 11.3-4; KleinManning03 Unlexicalized; ChenManning14; Andor+16 | A4 due
Oct 19 | Midterm (TBD) | |
Oct 24 | Understanding GPT3 1: Prompting GPT-3, Factuality [4pp] | Zhao+21 Calibrate Before Use; Min+22 Rethinking Demonstrations; Gonen+22 Demystifying Prompts; Min+23 FActScore; Gao+22 RARR; Olsson+22 Induction Heads | A5 out
Oct 26 | Understanding GPT3 2: Rationales, Chain-of-thought [4pp] | Camburu+18 e-SNLI; Wei+22 CoT; YeDurrett22 Unreliability; Kojima+22 Step-by-step; Gao+22 Program-aided; Ye+22 Complementary | Custom FP proposals due Oct 27
Oct 31 | Understanding GPT3 3: Instruction tuning, RL in NLP [4pp] | Sanh+21 T0; Liu+21 Prompting; Chung+22 Flan-PaLM; Ouyang+22 Human Feedback; Ramamurthy+22 RL for NLP; Roller+20 Facebook Blender; Gehman+20 Toxicity; Thoppilan+22 LaMDA |
Nov 2 | Understanding NNs 1: Dataset Bias [4pp] | Gururangan+18 Artifacts; McCoy+19 Right; Gardner+20 Contrast; Swayamdipta+20 Cartography; Utama+20 Debiasing |
Nov 7 | Understanding NNs 2: Interpretability [4pp] | Lipton+16 Mythos; Ribeiro+16 LIME; Simonyan+13 Visualizing; Sundararajan+17 Int Grad; Bansal+20 Whole Exceed Parts; Interpretation Tutorial | A5 due / FP out
Nov 9 | Guest Lecture: Eunsol Choi, Question Answering [4pp] | |
Nov 14 | Machine Translation, Multilinguality [4pp] | |
Nov 16 | Language Grounding [4pp] | |
Nov 21 | NO CLASS | |
Nov 23 | NO CLASS | |
Nov 28 | No class: project clinic | |
Nov 30 | Wrapup + Ethics [4pp] | HovySpruit16 Social Impact of NLP; Zhao+17 Bias Amplification; Rudinger+18 Gender Bias in Coref; BenderGebru+21 Stochastic Parrots; Gebru+18 Datasheets for Datasets; Raji+20 Auditing | FP due Dec 8