Lesson 10: Deep Learning Part 2 2018 - NLP Classification and Translation



Transfer learning has revolutionized computer vision, but until now it has largely failed to make much of an impact in NLP (and to some extent has simply been ignored). In this class we'll show how pre-training a full language model can greatly surpass previous approaches based on simple word vectors, and we'll use that language model to reach a new state-of-the-art result in text classification.
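The lesson itself uses the 2018-era fastai 0.7 library; purely as orientation, here is a minimal sketch of the same pretrain-the-language-model-then-fine-tune-a-classifier workflow written against the later fastai v2 API (so TextDataLoaders, language_model_learner, text_classifier_learner, and the IMDB folder layout below reflect that newer API, not necessarily what is typed in the video):

```python
from fastai.text.all import *

# IMDB sentiment corpus, the dataset used in the lesson.
path = untar_data(URLs.IMDB)

# 1) Fine-tune a pretrained AWD-LSTM language model on the raw IMDB text.
dls_lm = TextDataLoaders.from_folder(path, is_lm=True, valid_pct=0.1)
lm_learn = language_model_learner(dls_lm, AWD_LSTM, drop_mult=0.3, metrics=[accuracy])
lm_learn.fine_tune(1)
lm_learn.save_encoder('ft_enc')      # keep the encoder weights, drop the LM head

# 2) Build the text classifier on top of that fine-tuned encoder.
dls_clas = TextDataLoaders.from_folder(path, valid='test', text_vocab=dls_lm.vocab)
clas_learn = text_classifier_learner(dls_clas, AWD_LSTM, metrics=[accuracy])
clas_learn.load_encoder('ft_enc')    # transfer the pretrained language-model encoder
clas_learn.fine_tune(2)
```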
Comments

I like these original videos a lot better than the ones currently on his website... Those seem to be a lot more general now.

rothn

First of all, great video! Just a small suggestion: in the future, for loading large datasets (in the get_texts method), you could introduce a more batch-oriented approach. Loading a lot of files in a single loop is likely to cause I/O or memory errors and makes students spend extra time and effort working around the problem. It would also serve as an example of how to handle such problems with any other large dataset.

ewajuralewicz
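A minimal sketch of the batch-oriented loading suggested in the comment above, using plain Python generators; the directory layout (one .txt file per review, grouped in class folders) and the chunk size are assumptions for illustration, not what the notebook's get_texts actually does:

```python
from pathlib import Path

def iter_texts(root, classes=('neg', 'pos'), encoding='utf-8'):
    """Yield (label, text) pairs one file at a time instead of
    accumulating every document in memory inside a single loop."""
    for idx, label in enumerate(classes):
        for fname in sorted((Path(root) / label).glob('*.txt')):
            yield idx, fname.read_text(encoding=encoding)

def batched_texts(root, batch_size=1000):
    """Group the stream into fixed-size chunks so tokenization or
    writing to disk can happen batch by batch."""
    batch = []
    for item in iter_texts(root):
        batch.append(item)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch

# Usage (the path and the process() step are hypothetical):
# for chunk in batched_texts('data/aclImdb/train'):
#     process(chunk)
```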

Nice information, it inspires me. Thanks.

StanleySalvatierra

Where is the rest of the lesson?
Can anyone tell?

ranvijaysahu

Can I reuse this pretrained LM as a feature extractor for a CNN text classifier?

dannpham
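In principle yes, regarding the question above: freeze the language-model encoder and treat its hidden states as input features for a convolutional head. A self-contained PyTorch sketch of the idea; ToyLMEncoder here is only a stand-in for the actual pretrained encoder, whose fine-tuned weights you would load instead of initializing randomly:

```python
import torch
import torch.nn as nn

class ToyLMEncoder(nn.Module):
    """Stand-in for a pretrained language-model encoder (embedding + LSTM).
    In practice you would load the fine-tuned weights here."""
    def __init__(self, vocab_size=10000, emb_dim=300, hidden_dim=300):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.LSTM(emb_dim, hidden_dim, batch_first=True)

    def forward(self, tokens):                       # tokens: (batch, seq_len)
        out, _ = self.rnn(self.emb(tokens))
        return out                                   # (batch, seq_len, hidden_dim)

class CNNOverLMFeatures(nn.Module):
    """1-D convolutional classifier head on top of a frozen LM encoder."""
    def __init__(self, encoder, feat_dim, n_classes, n_filters=100, kernel_sizes=(3, 4, 5)):
        super().__init__()
        self.encoder = encoder
        for p in self.encoder.parameters():          # freeze: LM is a pure feature extractor
            p.requires_grad = False
        self.convs = nn.ModuleList([nn.Conv1d(feat_dim, n_filters, k) for k in kernel_sizes])
        self.head = nn.Linear(n_filters * len(kernel_sizes), n_classes)

    def forward(self, tokens):
        feats = self.encoder(tokens).transpose(1, 2)          # (batch, feat_dim, seq_len)
        pooled = [conv(feats).relu().max(dim=2).values for conv in self.convs]
        return self.head(torch.cat(pooled, dim=1))            # (batch, n_classes)

model = CNNOverLMFeatures(ToyLMEncoder(), feat_dim=300, n_classes=2)
logits = model(torch.randint(0, 10000, (8, 50)))              # 8 sequences of 50 token ids
```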

Great video and very straightforward explanation.

EliasCassab

Seems like you are using the fast.ai library directly instead of TensorFlow, Keras, or PyTorch.

ankitaggarwal

Can someone skip to this video without doing the previous lessons?

luvsuneja

That fixup() function at 28:57 looks scary!

OttoFazzl
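For what it's worth, the fixup() mentioned above is mostly a long chain of string replacements that undoes HTML escaping and scraping artifacts in the IMDB dump before tokenization. A rough approximation of what it does (the replacement list in the actual notebook is longer than what is shown here):

```python
import html
import re

re_spaces = re.compile(r'  +')          # collapse runs of spaces

def fixup(text):
    """Clean common HTML/scraping artifacts before tokenization.
    This mirrors the spirit of the notebook's fixup(); the real
    function has a longer list of replacements."""
    text = (text.replace('<br />', '\n')
                .replace('nbsp;', ' ')
                .replace('quot;', "'")
                .replace('#39;', "'")
                .replace('amp;', '&')
                .replace('\\"', '"'))
    return re_spaces.sub(' ', html.unescape(text))

print(fixup('Great!<br /><br />It#39;s a  classic.'))
# Great!
#
# It's a classic.
```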