My NLP Curriculum for July 2020

This post is inspired by my earlier post on the same topic.

I prepared my reading curriculum for the month of June, and it helped me track progress and stay focused on a specific topic for the month. Here’s the link to my earlier post.

What I learnt from having a curriculum

  1. It helps you stay focused on one thing for a month.
  2. Because there’s a timeline, you tend to spend your time efficiently.

What didn’t go well in June 2020

  1. I overestimated myself and couldn’t complete everything I had planned.
  2. I couldn’t decide if I wanted to skim through many topics or go deep into one topic or course.

Based on my experience from the last curriculum, I’ve decided on the following list for July 2020. Most of the material carries over from June.

Articles / Posts

  • Outline of NLP – Wikipedia article that outlines different concepts in NLP, such as prerequisite technology, subfields, process, and history. It’s worth revisiting once in a while to see where we stand in our understanding of the field.
  • NLP – Wikipedia article that goes into more detail on the different problems that are solved with the help of NLP

Project

Course

Stanford CS224N: NLP with Deep Learning | Winter 2019 – Stanford’s course on NLP taught by Christopher Manning. I spent most of June on this course.

Each lecture goes deep into the fundamentals and requires extra reading on related concepts.

I want to spend most of my time completing this course and understanding the basics of Natural Language Processing using Deep Learning.

List of lectures I want to complete in July:

  1. Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 1 – Introduction & Word Vectors [Done]
  2. Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 2 – Word Vectors & Word Senses [Done]
  3. Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 3 – Neural Networks
  4. Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 4 – Backpropagation
  5. Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 5 – Dependency Parsing
  6. Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 6 – Language Models & RNNs
  7. Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 7 – Vanishing Gradients & Fancy RNNs
  8. Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 8 – Translation, Seq2Seq, Attention
  9. Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 9 – Practical Tips for Projects
  10. Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 10 – Question Answering
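As a warm-up for the word-vector lectures (Lectures 1 and 2), here’s a minimal sketch of the cosine-similarity computation those lectures build on. The vectors below are toy, made-up 3-dimensional examples for illustration only; real embeddings from word2vec or GloVe typically have 100–300 dimensions.

```python
import numpy as np

def cosine_similarity(u, v):
    # Cosine of the angle between two word vectors:
    # 1.0 means same direction, 0.0 means orthogonal (unrelated).
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

# Toy "embeddings" (made up for illustration).
king  = np.array([0.8, 0.3, 0.1])
queen = np.array([0.7, 0.4, 0.1])
apple = np.array([0.1, 0.2, 0.9])

print(cosine_similarity(king, queen))  # similar words -> closer to 1
print(cosine_similarity(king, apple))  # unrelated words -> smaller value
```

The key intuition from the course is that words appearing in similar contexts end up with vectors pointing in similar directions, so cosine similarity becomes a usable measure of semantic relatedness.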

How I’m planning to read

I’ll read and practice for 1.5–2 hours every day.

On weekends, I’ll spend time learning about topics related to the course and working on projects.

If you’ve got any questions, or just want to say hi, feel free to reach out to me.