Deep Learning for NLP with Python
Interactive Online Course
- Tentative Start Date: Oct. 10th (if the minimum enrollment of 10 students is met)
- Saturdays, 10:00 am – 12:00 pm EDT
- Length: 10 weeks (20 hrs)
Important Notes:
- A “Certificate of Completion” will be awarded if you pass the final exam.
- The full price of the course is $130. If you register before Oct. 5th, you get $30 off (i.e., the discounted price is $100).
- The lectures are in either Farsi or English, but the course material is in English.
- The course is associated with R&D projects that offer specific benefits for students.
Course Description
If you know the basics of machine learning and would like to learn how to run an AI project on time-series data, this is the right place to learn state-of-the-art sequence models. More specifically, we discuss natural language processing (NLP), in which the data is a sequence of words or letters.
In the first part, we learn the basics of NLP and recurrent neural networks (RNNs), which are widely used for sequence modeling of time-series data. Besides a multi-label classification task in Lab 1 with Python, we implement a deep learning model for sequence tagging in Lab 2 with the PyTorch framework. The second part covers attention mechanisms and Transformers, the state-of-the-art sequence modeling techniques. We learn how to use pre-trained models in a transfer learning fashion for new tasks with little supervised data.
Avoiding complex mathematics, this course presents the underlying concepts of deep learning models for time-series data (RNNs) and their applications in an accessible way. Moreover, we review attention models. The concepts are reinforced in lab sessions that introduce popular Python libraries and implement sample applications.
Course Outline
Part 1
- Introduction (Definitions, Applications, Tasks, Approaches)
- Basics of Linguistics (NLP Pyramid, Vectorizations: BoW, TF-IDF, word2vec)
- Lab 1. Automatic Tagging in Python
- Recurrent Neural Networks (ML Components, LSTM, GRU)
- Lab 2. Named Entity Recognition in PyTorch
Part 2
- Attention mechanisms & Transformers
- BERT
- Lab 3. Spam classification with BERT in PyTorch
- Chatbots & Question/Answering
- Lab 4. Q/A with BERT in PyTorch
Prerequisites
- It is highly recommended to complete the Introduction to AI & ML and Unsupervised Learning courses in advance.
- Basic Python programming is needed. For the lab sessions, we use Google Colab, so you don’t need to install anything on your machine. The DL programs employ the PyTorch framework.
Resources
The AI & ML Series
DL for NLP with Python is the third course in the AI & ML series. The list of courses in the AI & ML boot camp is as follows:
- Introduction to AI & ML
- Unsupervised Learning
- DL for NLP with Python
- Reinforcement Learning
- Convolutional Neural Networks
Section 1. Introduction to NLP
Natural Language Processing (NLP) is an interdisciplinary field of computer science, AI, and linguistics. We will learn NLP applications, tasks, and approaches in this section.
Section 2. Basics of Linguistics
We will learn the basics of linguistics, including the NLP Pyramid, current challenges, and vectorization techniques in this section.
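As a taste of the vectorization techniques covered in this section, here is a minimal sketch, assuming scikit-learn and a toy corpus (both illustrative choices, not the course's exact material), of turning raw text into BoW and TF-IDF vectors:

```python
# Minimal sketch of BoW and TF-IDF vectorization with scikit-learn.
# The toy corpus below is illustrative, not from the course material.
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer

corpus = [
    "deep learning for NLP",
    "NLP is fun",
    "deep learning is powerful",
]

# Bag-of-Words: raw term counts per document
bow = CountVectorizer()
X_bow = bow.fit_transform(corpus)
print(bow.get_feature_names_out())  # the learned vocabulary
print(X_bow.toarray())              # document-term count matrix

# TF-IDF: counts reweighted by inverse document frequency
tfidf = TfidfVectorizer()
X_tfidf = tfidf.fit_transform(corpus)
print(X_tfidf.toarray().round(2))
```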
Lab 1. Automatic Tagging in Python
Automatic tagging on a StackOverflow dataset is a multi-label classification task. We review the implementation in Python here.
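As a rough sketch of what this lab involves, the snippet below shows a multi-label tagging pipeline in scikit-learn; the tiny dataset and the one-vs-rest logistic regression model are assumptions for illustration, not the exact lab pipeline:

```python
# Sketch of multi-label tag prediction with scikit-learn.
# The tiny dataset and model choice are illustrative assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier
from sklearn.preprocessing import MultiLabelBinarizer

titles = [
    "How to merge two dicts in Python?",
    "Read a CSV file with pandas",
    "Segfault in C pointer arithmetic",
]
tags = [["python"], ["python", "pandas"], ["c"]]

mlb = MultiLabelBinarizer()
Y = mlb.fit_transform(tags)    # binary indicator matrix, one column per tag

vec = TfidfVectorizer()
X = vec.fit_transform(titles)

# One binary classifier per tag handles the multi-label setting
clf = OneVsRestClassifier(LogisticRegression()).fit(X, Y)
pred = clf.predict(vec.transform(["pandas dataframe merge in Python"]))
print(mlb.inverse_transform(pred))
```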
Section 4. Recurrent Neural Networks
Recurrent Neural Networks (RNNs) are widely used in time-series analysis for sequence modeling. RNNs are utilized in NLP as text is also a type of sequence.
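A minimal sketch of running a batch of sequences through an LSTM in PyTorch (all dimensions below are illustrative assumptions):

```python
# Minimal sketch: passing a batch of embedded sequences through an LSTM.
# All sizes are illustrative assumptions.
import torch
import torch.nn as nn

batch, seq_len, emb_dim, hidden = 2, 5, 16, 32
x = torch.randn(batch, seq_len, emb_dim)      # embedded input sequences

lstm = nn.LSTM(input_size=emb_dim, hidden_size=hidden, batch_first=True)
outputs, (h_n, c_n) = lstm(x)

print(outputs.shape)  # (2, 5, 32): one hidden state per time step
print(h_n.shape)      # (1, 2, 32): final hidden state per layer/direction
```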
Lab 2. NER with LSTM in PyTorch
We use an LSTM for sequence tagging, specifically for Named Entity Recognition. We learn how to implement the RNN in PyTorch in this lab session.
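A sketch of the kind of LSTM sequence tagger built in this lab; the vocabulary size, tag set, and dimensions are assumptions, not the lab's actual configuration:

```python
# Sketch of an LSTM sequence tagger for NER in PyTorch.
# Vocab size, tag set, and dimensions are illustrative assumptions.
import torch
import torch.nn as nn

class LSTMTagger(nn.Module):
    def __init__(self, vocab_size, emb_dim, hidden, num_tags):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hidden, num_tags)  # one tag score per token

    def forward(self, token_ids):
        h, _ = self.lstm(self.embed(token_ids))
        return self.fc(h)                          # (batch, seq_len, num_tags)

model = LSTMTagger(vocab_size=1000, emb_dim=32, hidden=64, num_tags=5)
tokens = torch.randint(0, 1000, (2, 7))           # a fake batch of token ids
logits = model(tokens)
print(logits.shape)                               # torch.Size([2, 7, 5])
```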
Section 6. Transformers
The Transformer is a successful neural network architecture for natural language processing. We will learn the concept of the attention mechanism; then we review the Transformer architecture.
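The core of the attention mechanism is scaled dot-product attention; a minimal PyTorch sketch (shapes are illustrative assumptions):

```python
# Scaled dot-product attention, the building block of the Transformer.
# Shapes are illustrative assumptions.
import math
import torch

def scaled_dot_product_attention(q, k, v):
    # q, k, v: (batch, seq_len, d_k)
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))  # query-key similarity
    weights = torch.softmax(scores, dim=-1)                   # attention distribution
    return weights @ v                                        # weighted sum of values

q = k = v = torch.randn(2, 5, 16)   # self-attention: q, k, v from the same sequence
out = scaled_dot_product_attention(q, k, v)
print(out.shape)                    # torch.Size([2, 5, 16])
```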
Section 7. BERT
BERT is a promising language model applied in many NLP applications. It employs the encoder part of the Transformer. We will learn the BERT architecture; then, we learn how to use the pre-trained BERT model.
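A common way to load a pre-trained BERT model is via the Hugging Face transformers library; whether the course labs use this exact loading path is an assumption, but the pattern looks like this:

```python
# Common pattern for using pre-trained BERT via Hugging Face transformers.
# Whether the course uses this exact library is an assumption.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("Deep learning for NLP", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, num_tokens, 768) contextual embeddings
```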
Lab 3. Spam Classification with BERT in PyTorch
As a simple example, in this lab session, we use the pre-trained BERT model for spam classification. The code is in PyTorch.
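A sketch of fine-tuning BERT for binary spam classification; the example texts, labels, and training details are assumptions, not the lab's actual dataset or loop:

```python
# Sketch: BERT with a classification head for spam vs. ham.
# Dataset, labels, and training-loop details are assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # 0 = ham, 1 = spam
)

texts = ["WIN a FREE prize now!!!", "Are we still meeting at noon?"]
labels = torch.tensor([1, 0])

batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
out = model(**batch, labels=labels)  # passing labels makes the model return a loss
out.loss.backward()                  # compute gradients; an optimizer.step() would follow
print(out.logits.shape)              # torch.Size([2, 2])
```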
Section 9. Chatbots & Question/Answering Models
Chatbots are widely used for entertainment and marketing. We review different types of chatbots and question/answering models.
Lab 4. Q/A with BERT in PyTorch
The Google QUEST Kaggle competition is a Q/A labeling task of the kind needed for Q/A chatbots. We review the code of the competition’s winning solution.
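As a taste of extractive Q/A with BERT-family models, here is a generic sketch using the transformers pipeline; this is a standard example under assumed defaults, not the QUEST winning solution reviewed in the lab:

```python
# Simple extractive question answering with a BERT-family model via the
# transformers pipeline. This is a generic example, not the QUEST solution.
from transformers import pipeline

qa = pipeline("question-answering")  # downloads a default fine-tuned QA model
result = qa(
    question="What framework do the labs use?",
    context="The lab sessions of this course are implemented in PyTorch on Google Colab.",
)
print(result["answer"])  # expected: "PyTorch"
```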