Course Description

The application of neural network methods---under the name "deep learning"---has led to breakthroughs in a wide range of fields, including language technologies (e.g. search, translation, text input prediction). This course provides a hands-on introduction to the use of deep learning methods for processing natural language. Methods covered include static word embeddings, feed-forward networks for text, recurrent neural networks, transformers, and pre-training / transfer learning, with applications including sentiment analysis, translation, and generation.

Days: Monday and Wednesday
Time: 1:00 - 2:20 PM
Location: ARC G070 and https://washington.zoom.us/j/98228669635

Note: while lectures will be delivered live at the above time and location, they will also be recorded and posted to the course Canvas page.

Teaching Staff

Instructor: Shane Steinert-Threlkeld
  Office: GUG 415K and https://washington.zoom.us/my/shanest
  Office Hours: Wednesday 3-5 PM

Teaching Assistant: Saiya Karamali
  Office: GUG 407 and https://washington.zoom.us/s/92010041700
  Office Hours: Tuesday 3:30 - 5:30 PM

Recommended Textbooks

While relevant readings are posted in the schedule below, the following are very good general resources; the abbreviations used to refer to them in the schedule are given in parentheses.

  • Speech and Language Processing, 3rd edition draft, by Dan Jurafsky and James H. Martin (JM)
  • Neural Network Methods for Natural Language Processing, by Yoav Goldberg (YG)
  • Deep Learning, by Ian Goodfellow, Yoshua Bengio, and Aaron Courville (GBC)

Prerequisites

  • LING 572 or equivalent machine learning course
  • Programming in Python
  • Linux/Unix Commands
  • Linear algebra
  • Multivariable calculus (especially partial derivatives / gradients of multivariable functions; see the self-check below)
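
As a quick self-check of the calculus prerequisite (an illustrative example, not course material): if f(x, y) = x^2 y + y^3, then ∂f/∂x = 2xy and ∂f/∂y = x^2 + 3y^2, so the gradient is ∇f(x, y) = (2xy, x^2 + 3y^2). Gradient descent, introduced in the first week, repeatedly updates parameters by taking small steps in the direction of -∇f.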

Course Resources

N.B.: All homework grading will take place on the patas cluster using Condor, so your code must run there. I strongly encourage you to ensure you have an account set up by the time of the first course meeting.
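
If you have not used Condor before, submission looks roughly like the following sketch. The file and script names here are illustrative placeholders, not course requirements; each assignment will specify its own setup.

    # hw1.cmd: a minimal Condor submit description (all names are placeholders)
    executable          = run_hw1.sh   # entry-point script for your solution
    getenv              = true         # inherit your shell environment variables
    output              = hw1.out      # where the job's stdout is written
    error               = hw1.err      # where the job's stderr is written
    log                 = hw1.log      # Condor's log of the job's lifecycle
    transfer_executable = false        # the script already lives on the cluster
    queue                              # enqueue one job with these settings

You would then run condor_submit hw1.cmd to submit the job, and condor_q to check its status.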

Policies

Unless explicitly mentioned below, the shared policies of the LING 57x course series apply to this course. Please read those policies for more information.

We understand that you may face hard times as we navigate an ever-changing world due to the COVID-19 pandemic and many other world events. If you find yourself struggling with a difficult concept; stressed over politics or health; slowed by monopolistic internet providers; or annoyed at a classmate, please remember that others might feel similarly: maybe not at that very moment, but certainly recently or soon. Some of you may find the return to hybrid teaching conducive to your style of learning and personality; others may find it stressful or difficult. These are all normal reactions. Please have compassion and empathy, and assume that everyone is doing their best.

If you find yourself having trouble learning in class, please do not hesitate to let me or Saiya Karamali know. Our goal is to make this class a bright spot in these unprecedented times, and to do whatever we can to promote a healthy learning environment for all.

A note on time zones

All deadlines and meeting times for this class are in Pacific Time. Now that we are in Daylight Saving Time, this is UTC-7.

Grading

  • 100%: Homework Assignments
  • Up to 2% adjustment for significant in-class or discussion participation

Communication

As per the policy above, all communication outside of the classroom should take place on Canvas. You can expect responses from teaching staff within 48 hours, counting only normal business hours (weekends excluded).

N.B.: while CLMS students have a private Slack channel, I strongly encourage you to post questions concerning course content and assignments to the Canvas discussion board, for two reasons: (i) teaching staff will not look at Slack, so misinformation can spread there unchecked; (ii) not every student in the course is in the CLMS program, and all students deserve to be included in course discussions and likely have many of the same questions.

Religious Accommodation

Washington state law requires that UW develop a policy for accommodation of student absences or significant hardship due to reasons of faith or conscience, or for organized religious activities. The UW’s policy, including more information about how to request an accommodation, is available at Religious Accommodations Policy (https://registrar.washington.edu/staffandfaculty/religious-accommodations-policy/). Accommodations must be requested within the first two weeks of this course using the Religious Accommodations Request form (https://registrar.washington.edu/students/religious-accommodations-request/).

Access and Accommodations

Your experience in this class is important to me. If you have already established accommodations with Disability Resources for Students (DRS), please communicate your approved accommodations to me at your earliest convenience so we can discuss your needs in this course.

If you have not yet established services through DRS, but have a temporary health condition or permanent disability that requires accommodations (conditions include, but are not limited to: mental health, attention-related, learning, vision, hearing, physical, or other health impacts), you are welcome to contact DRS at 206-543-8924, uwdrs@uw.edu, or disability.uw.edu. DRS offers resources and coordinates reasonable accommodations for students with disabilities and/or temporary health conditions. Reasonable accommodations are established through an interactive process between you, your instructor(s), and DRS. It is the policy and practice of the University of Washington to create inclusive and accessible learning environments consistent with federal and state law.

Safety

Call SafeCampus at 206-685-7233 anytime – no matter where you work or study – to anonymously discuss safety and well-being concerns for yourself or others. SafeCampus’s team of caring professionals will provide individualized support, while discussing short- and long-term solutions and connecting you with additional resources when requested.

Schedule


Each entry below lists the date and topics, followed by the associated readings and events (homework releases and due dates).

Mar 25: Introduction / Overview; History

Mar 27: Gradient descent; Word vectors
  Readings: JM ch 5.4-5.6, ch 6; YG ch 2
  Events: HW1 out [pdf, tex, slides], due Apr 4

Apr 1: Word vectors; Classification and language modeling
  Readings: JM 6.8 - 6.12

Apr 3: Neural Networks 1
  Readings: JM 7.1 - 7.4; YG ch 4
  Events: HW2 out [pdf, tex, slides], due Apr 11

Apr 8: Computation graphs; backpropagation; edugrad library
  Readings: JM 7.5.3 - 7.5.5; YG 5.1.1 - 5.1.2; GBC ch 6.5; Calculus on computational graphs; CS 231n notes 1; Yes, you should understand backprop

Apr 10: Feed-forward networks for LM and classification
  Readings: JM 7.6; YG ch 9; A Neural Probabilistic Language Model (Bengio et al 2003); Deep Unordered Composition Rivals Syntactic Methods for Text Classification (Iyyer et al 2015)
  Events: HW3 out [pdf, tex], due Apr 18

Apr 15: Recurrent neural networks
  Readings: JM 9.1-9.5; The Unreasonable Effectiveness of Recurrent Neural Networks

Apr 17: Vanishing gradients; RNN variants
  Readings: JM 9.6; YG ch 15; Understanding LSTMs; Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation; On the difficulty of training recurrent neural networks
  Events: HW4 out [pdf, tex, slides], due Apr 25

Apr 22: Sequence-to-sequence; Attention
  Readings: JM ch 10; Sequence to Sequence Learning with Neural Networks (original seq2seq paper); Neural Machine Translation by Jointly Learning to Align and Translate (original seq2seq + attention paper)

Apr 24: Transformers 1
  Readings: JM 9.7-9.9; Attention is All You Need (original Transformer paper); The Annotated Transformer; The Illustrated Transformer
  Events: HW5 out [pdf, tex], due May 2

Apr 29: Transformers 2
  Readings: same as Transformers 1 (Apr 24)

May 1: Pre-training / fine-tuning paradigm
  Readings: JM ch 11; Contextual Word Representations: Putting Words into Computers; The Illustrated BERT, ELMo, and co. (How NLP Cracked Transfer Learning)
  Events: HW6 out [pdf, tex], due May 9

May 6: Pre-training / fine-tuning paradigm (cont.)
  Readings: same as May 1

May 8: Interpretability and Analysis
  Readings: Analysis Methods in Natural Language Processing; A Primer in BERTology
  Events: HW7 out [pdf, tex], due May 16

May 13: Overflow

May 15: From Language Models to Large "Language Models"
  Readings: Finetuned Language Models are Zero-Shot Learners; Training language models to follow instructions with human feedback; Direct Preference Optimization: Your Language Model is Secretly a Reward Model
  Events: HW8 out [pdf, tex], due May 23

May 20: Special topic: societal impacts (guest lecture: Angelina McMillan-Major)
  Readings: ChatGP-Why (Emily M Bender recording) [video, slides]

May 22: Special topic: Low-resource / Multilingual NLP (guest lecture: C.M. Downey)
  Readings: Cross-Lingual Language Model Pretraining
  Optional / peruse if interested: When Is Multilinguality a Curse? Language Modeling for 250 High- and Low-Resource Languages; Are All Languages Created Equal in Multilingual BERT?; Emerging Cross-lingual Structure in Pretrained Language Models; On the Cross-lingual Transferability of Monolingual Representations; Bilingual alignment transfers to multilingual alignment for unsupervised parallel text mining
  Events: HW9 out [pdf, tex], due May 30

May 27: No Class (Memorial Day)

May 29: Summary / Review / Discussion