Lesson 3: Practical Deep Learning for Coders 2022

00:00 Introduction and survey
02:25 How to do a fastai lesson
04:28 How to not self-study
05:28 Highest voted student work
07:56 Pets breeds detector
08:52 Paperspace
10:16 JupyterLab
12:11 Make a better pet detector
13:47 Comparison of all (image) models
15:49 Try out new models
19:22 Get the categories of a model
20:40 What’s in the model
21:23 What does model architecture look like
22:15 Parameters of a model
23:36 Create a general quadratic function
27:20 Fit a function by hand and eye
30:58 Loss functions
33:39 Automate the search of parameters for better loss
42:45 The mathematical functions
43:18 ReLU: Rectified linear function
45:17 Infinitely complex function
49:21 A chart of all image models compared
52:11 Do I have enough data?
54:56 Interpret gradients in unit?
56:23 Learning rate
1:00:14 Matrix multiplication
1:04:22 Build a regression model in spreadsheet
1:16:18 Build a neural net by adding two regression models
1:18:31 Matrix multiplication makes training faster
1:21:01 Watch out! It's chapter 4
1:22:31 Create dummy variables of 3 classes
1:23:34 Taste NLP
1:27:29 fastai NLP library vs Hugging Face library
1:28:54 Homework to prepare you for the next lesson

Comments

Wow, this guy is a deep learning/ML genius!

I've been studying deep learning for 2 months now, and I consider myself quite good at math and coding. I've been looking for an explanation of what is happening under the hood when the model is training - an "explain like I'm 5" type of explanation.

But the only things I could find were academic explanations of how a deep neural network trains, with matrix multiplications of weights, biases, backpropagation, etc.

I've probably watched 30 of those videos, all copycats of each other, and I think those people don't know what they are talking about; they're just repeating what they saw or read in academic papers and courses.

This video was an eye-opener; the guy really knows what is happening behind the scenes, and his 30 years of expertise in the field really shows in those simple yet very easy-to-understand explanations.

Thank you! 🙏

kentcartridge

The quadratic section is a beautifully crafted example. Thanks

chronicfantastic

I greatly appreciate this effort to uplift the community worldwide

TomHutchinson

I've watched so many videos, read so many blogs and books, trying to understand what a neural network is and how it learns. You explained it perfectly, making all the words just fit. The meanings become obvious when presented like this, and you did it in 15 minutes 🔥

orchestra

The quadratic example was a really good illustration of how gradient descent works and is great for building intuition. The Excel example then cements the understanding with a solid dataset. This is my favourite of the three lectures so far.

TheCJD
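
The quadratic demo this comment describes can be reproduced in a few lines. Below is a minimal sketch of the idea, assuming PyTorch; the data, starting values, and learning rate are illustrative and not taken from the lesson notebook. Gradient descent nudges the three coefficients of a*x^2 + b*x + c downhill until the loss stops falling.

# A minimal sketch (not the lesson notebook): fit a quadratic
# a*x^2 + b*x + c to noisy data by gradient descent, using PyTorch.
import torch

torch.manual_seed(0)
x = torch.linspace(-2, 2, steps=50)
y = 3 * x**2 + 2 * x + 1 + torch.randn(50) * 0.1      # noisy "true" quadratic

params = torch.tensor([1.0, 1.0, 1.0], requires_grad=True)   # guesses for a, b, c

def mse(preds, targets):
    return ((preds - targets) ** 2).mean()             # mean squared error loss

lr = 0.05                                              # learning rate (illustrative)
for step in range(300):
    a, b, c = params
    preds = a * x**2 + b * x + c
    loss = mse(preds, y)
    loss.backward()                                    # gradients of the loss w.r.t. a, b, c
    with torch.no_grad():
        params -= lr * params.grad                     # step each parameter downhill
        params.grad.zero_()

print(params.data, loss.item())                        # coefficients approach 3, 2, 1 as the loss falls

The spreadsheet walkthrough later in the lesson does the same thing with cells instead of tensors; only the bookkeeping changes.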

I "knew" that deep learning models used the sum of wi +xi + b function, I "knew" that it supposedly was used because it was an "all purpose" function, but now thanks to you Jeremy I know WHY its an "all purpose" function
10/10 explanation. Math should always be explained like this, its actually beautiful to see it all unfold.

manug
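
For anyone puzzling over the same point, here is a minimal, illustrative sketch (NumPy; the particular weights and biases are made-up numbers, not from the lesson) of why a weighted sum followed by ReLU is such a general-purpose building block: each relu(w*x + b) term is a straight line bent at one point, and adding enough bent lines can trace out an arbitrarily wiggly curve.

import numpy as np

def relu(z):
    return np.maximum(0, z)            # rectified linear: negatives clipped to zero

x = np.linspace(-3, 3, 200)

# Three "bent lines", each a relu of a weighted input plus a bias,
# combined with different output weights; more terms = more wiggles.
approx = (
    1.0 * relu( 1.0 * x + 0.0)
  - 2.0 * relu( 1.0 * x - 1.0)
  + 1.5 * relu(-1.0 * x - 0.5)
)

print(approx.min(), approx.max())      # a piecewise-linear curve with kinks at x = 0, 1, -0.5

Stacking the same trick into matrices, relu(X @ W + b) followed by another weighted sum, is essentially the move at 1:16:18: add two ReLU-ed regression models and you have a neural net.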

I couldn't understand why ReLU was needed, and now I do. I'm a programmer, and I think this is the DL course for me. The explanation is very easy to understand. Thank you!

ucmaster

Great foundational lecture. Jeremy has a relaxed, non-intimidating approach that works for me. Brilliant step by step walk into the deep end of the pool without getting us lost or scared :) Thank you for taking the time to put this together.

andrespineda

For those following along, there was a mistake in the spreadsheet range when calculating total loss: both at 1:14:27 and 1:17:40, it selects from row 662 instead of row 4. The correct solved losses are 0.144 and 0.143.

acceptapply

Great lesson!! Jeremy deciding to approach chapter 4 differently after seeing many students quit at this point really shows that he cares about students' learning. The effort is greatly appreciated! 🙏

ed

I had already heard many of these terms, like loss function, fitting a model, activation function, ReLU.
JH is an amazing teacher; these things are now crystal clear in my mind.
Thank you so much, JH.

TheBhumbak

I've gone through many great courses on all sorts of subjects, but I think this course might be the best. Kudos for putting this fantastic content out there for free for everyone to learn from.

dingus

This is god-tier educational content, sir. Thanks for sharing it!

maraoz

Probably the easiest-to-digest material I've seen on the subject. Thank you.

Al-yovz

I am a newbie in machine learning, but the approach you took in this lesson to explain difficult concepts makes it so easy to understand. Great work.

duybuidoi

The explanation of the deep learning foundations here is so good! As Jeremy said, one has to remind oneself: that's it, there is no more.

abdulkadirguner

I was lucky to have good math teachers in high school. Jeremy explaining the concepts reminded me of them. Thanks.

_ptoni_

New didactic and methodological ideas, which I like very much. Still a bit rough in execution, but it opens up amazing new territory for approaching neural networks and deep learning. Well done!

mrjohn

The excel example blew my mind. Loved this lesson. Thank you.

allthatyouare

Thank you so much, Jeremy, for making this course. I am going slow but learning a lot every day; you are a very patient teacher. Thank you.

DevashishJose