0:18 Review of Last Week
18:12 Segmentation
18:34 Feature Pyramids
18:57 NLP
21:28 Basic Paths for NLP
27:22 Train/Test Split
28:05 Tokenization
43:45 Building a Language Model on Wikipedia
48:07 Create an Embedding Matrix
1:01:24 Averaging the Weights of Embeddings
1:11:36 Language Model
1:16:49 Edit Encoder
1:18:07 Regularizing and Optimizing LSTM Language Models
1:19:10 Tie Weights
1:21:56 Measure Accuracy
1:25:26 What Is Your Ratio of Paper Reading versus Coding in a Week?
1:28:59 Universal Sentence Encoder
1:39:38 Add More than One Hidden Layer
1:49:48 Learning Rate
1:51:57 Concat Pooling
2:01:26 Trick Number Two Is To Create Python Scripts
2:02:55 IMDb Scripts
Lesson 10: Deep Learning Part 2 2018 - NLP Classification and Translation
257 likes · 41,776 views · May 7, 2018
NB: Please go to http://course.fast.ai/part2.html to view this video, since there is important updated information there. If you have questions, use the forums at http://forums.fast.ai. After reviewing what we've learned about object detection, today we jump into NLP, starting with an introduction to the new fastai.text library. This is a replacement for torchtext that is faster and more flexible in many situations. A lot of today's class will be very familiar, since we're covering much of the same ground as lesson 4. But today's lesson will show you how to get much more accurate results by using transfer learning for NLP. Transfer learning has revolutionized computer vision, but until now it has largely failed to make much of an impact in NLP (and to some extent has simply been ignored). In this class we'll show how pre-training a full language model can greatly surpass previous approaches based on simple word vectors. We'll use this language model to show a new state-of-the-art result in text classification.
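As context for the tokenization and numericalization steps covered in the lesson, here is a minimal sketch of that preprocessing stage. The function names (`tokenize`, `build_vocab`, `numericalize`) and the special tokens (`_unk_`, `_pad_`) are illustrative only, not the fastai.text API; the real library uses a proper tokenizer (spaCy) with many more rules.

```python
import re
from collections import Counter

def tokenize(text):
    # Naive lowercase word/punctuation split; a real NLP tokenizer
    # also handles contractions, casing markers, and so on.
    return re.findall(r"\w+|[^\w\s]", text.lower())

def build_vocab(texts, max_size=60000, min_freq=2):
    # Keep the most frequent tokens; rarer tokens fall back to _unk_.
    counts = Counter(tok for t in texts for tok in tokenize(t))
    itos = ["_unk_", "_pad_"]  # special-token names are illustrative
    itos += [tok for tok, n in counts.most_common(max_size) if n >= min_freq]
    stoi = {tok: i for i, tok in enumerate(itos)}
    return itos, stoi

def numericalize(text, stoi):
    # Map tokens to integer IDs; unseen tokens get the _unk_ ID (0).
    return [stoi.get(tok, 0) for tok in tokenize(text)]
```

These integer ID sequences are what get fed into the embedding matrix of the language model discussed later in the lesson.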


Jeremy Howard