Lesson 3 - Deep Learning for Coders (2020)

Today we finish creating and deploying our own app. We discuss data augmentation, and look at the most important types of augmentation used in modern computer vision models. We also see how fastai helps you process your images to get them ready for your model.

We look at building GUIs, both for interactive apps inside notebooks, and also for standalone web applications. We discuss how to deploy web applications that incorporate deep learning models. In doing so, we look at the pros and cons of different approaches, such as server-based and edge-device deployment.

Our final productionization topic is what can go wrong, how to avoid problems, and how to keep your data product working effectively in practice.

Then we skip over to chapter 4 of the book, and learn about the underlying math and code of Stochastic Gradient Descent, which lies at the heart of neural network training.
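For reference, the core of that chapter fits in a few lines of PyTorch. This is only a minimal sketch of the SGD loop, using a made-up quadratic model and synthetic data rather than the notebook's exact code:

import torch

# Model and loss: a quadratic in t with three learnable parameters.
def f(t, params):
    a, b, c = params
    return a*(t**2) + b*t + c

def mse(preds, targets): return ((preds - targets)**2).mean()

# Synthetic data roughly shaped like the lesson's roller-coaster speeds.
time = torch.arange(20).float()
speed = 0.75*(time - 9.5)**2 + 1 + torch.randn(20)*3

params = torch.randn(3, requires_grad=True)
lr = 1e-5
for _ in range(10):
    loss = mse(f(time, params), speed)
    loss.backward()                        # gradients of the loss w.r.t. params
    params.data -= lr * params.grad.data   # step against the gradient
    params.grad = None                     # reset gradients for the next step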

0:00 - Recap of Lesson 2 + What's next
1:08 - Resizing Images with DataBlock
8:46 - Data Augmentation and item_tfms vs batch_tfms
12:28 - Training your model, and using it to clean your data
18:07 - Turning your model into an online application
36:12 - Deploying to a mobile phone
38:13 - How to avoid disaster
50:59 - Unforeseen consequences and feedback loops
57:20 - End of Chapter 2 Recap + Blogging
1:04:09 - Starting MNIST from scratch
1:06:58 - untar_data and path explained
1:10:57 - Exploring the MNIST data
1:12:05 - NumPy Array vs PyTorch Tensor
1:16:00 - Creating a simple baseline model
1:28:38 - Working with arrays and tensors
1:30:50 - Computing metrics with Broadcasting
1:39:46 - Stochastic Gradient Descent (SGD)
1:54:40 - End-to-end Gradient Descent example
2:01:56 - MNIST loss function
2:04:40 - Lesson 3 review
Comments

Warning!!! The pickle module is not secure. Only unpickle data you trust.

It is possible to construct malicious pickle data which will execute arbitrary code during unpickling. Never unpickle data that could have come from an untrusted source, or that could have been tampered with.

Safer serialization formats such as json may be more appropriate if you are processing untrusted data.
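For anyone wondering how that is possible: pickle lets an object name any callable to run when it is rebuilt. A minimal sketch of the attack (the shell command is a harmless placeholder), plus the JSON alternative:

import json
import os
import pickle

class Malicious:
    # pickle calls __reduce__ to learn how to rebuild the object,
    # so it can be told to call any function, e.g. os.system.
    def __reduce__(self):
        return (os.system, ('echo arbitrary code ran during unpickling',))

payload = pickle.dumps(Malicious())
pickle.loads(payload)   # runs the shell command above

# JSON can only reconstruct plain data types, never code:
print(json.loads('{"arch": "resnet18", "epochs": 4}'))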

mendelovitch

Mr. Jeremy Howard, thank you so much for doing this. We (software developers) really need this kind of approach to ML and AI.

_nikolanenadovic

You've outdone yourself, Jeremy. Loving the course!

MaxwellMcKinnon

Absolutely amazing stuff. Thank you so much, Mr Jeremy Howard, from a French radiologist.

tamvan

1:11:35 - Short description of the Python Imaging Library (PIL)
1:18:48 - Practical usage of lists for image processing
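A minimal sketch of what those sections cover; the file names here are hypothetical stand-ins for the notebook's MNIST samples:

from PIL import Image   # installed nowadays via the Pillow fork

im = Image.open('three.png')   # hypothetical path to a digit image
print(im.size, im.mode)        # e.g. (28, 28) 'L' for a grayscale digit

# List comprehensions make it easy to process many images at once:
thumbs = [Image.open(p).resize((14, 14)) for p in ['three.png', 'seven.png']]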

curumo_curunir

Mr. Jeremy, you are a genius at explaining things. Thank you for making this course!

adhoc

1:16:04 There is a typo: "The entire image contains 28 pixels across and 28 pixels down, for a total of 768 pixels."
28*28 = 784.
This may be an issue for models that need the image flattened into a vector.
Thanks for this great course.
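Right, the correct figure is 784. A quick check, plus the flattening step that matters when feeding a linear model (a sketch, not the lesson's exact code):

import torch

img = torch.rand(28, 28)   # an MNIST-sized image
vec = img.view(-1)         # flatten into one long vector
print(28*28, vec.shape)    # 784 torch.Size([784])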

sinaasadiyan

Great course, great explanations of the conceptual intuition of how ML models work!

lucaseichhorn

14:49 I'll call it a very cute good boi ʕ•ᴥ•ʔ. I would definitely keep it.

makxell

Can someone explain how Python allows arguments like (path/"images") in these functions? I have never come across functions like this.
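For reference, this is standard-library pathlib rather than anything fastai-specific: Path overloads the / operator (__truediv__) to join path segments. A minimal sketch:

from pathlib import Path

path = Path('data')
imgs = path / 'images'          # same as Path('data/images')
print(imgs)                     # data/images
print(isinstance(imgs, Path))   # True; fastai's untar_data returns a Path too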

DrVivekKarn

At 1:58:57, why do we call loss.backward()? Doesn't this give us the gradient of the MSE loss function, and not of the f() that we are referring to?
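A tiny example of why this works: autograd differentiates the whole computation graph, so the derivative of f() is folded into d(loss)/d(params) by the chain rule. A sketch with a one-parameter model:

import torch

w = torch.tensor(2.0, requires_grad=True)
x = torch.tensor(3.0)

pred = w * x              # f(x) = w*x
loss = (pred - 1.0)**2    # squared error for a single point
loss.backward()

# By hand: d(loss)/dw = 2*(pred - 1) * d(pred)/dw = 2*(6 - 1)*3 = 30
print(w.grad)             # tensor(30.)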

jiykauu

Hi, does anyone know why the function on which backward() is called needs a scalar output, even though xt.grad gives a tensor?

i.e. why won't def f(x): return x**2 work?

Thanks for the great lecture Jeremy!
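For what it's worth: backward() computes the gradient of a scalar with respect to the inputs; a tensor-valued output would need a full Jacobian, so PyTorch asks for a weighting instead. A minimal sketch of both ways around it:

import torch

xt = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
(xt**2).sum().backward()    # sum first: each element's gradient survives
print(xt.grad)              # tensor([2., 4., 6.])

xt2 = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
y = xt2**2                  # tensor output
y.backward(gradient=torch.ones_like(y))   # explicit weighting, same result
print(xt2.grad)             # tensor([2., 4., 6.])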

rshreyas

Sir, when you use RandomResizedCrop, does that mean each image is resized and cropped in four different ways, so the total number of images seen by the CNN is four times the number of initially downloaded images? I am seeing at 7:54 that the same image has four different versions. If so, how do we set it to show 5 or 6 different versions instead of 4?
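Not quite: RandomResizedCrop takes one new random crop each time an image is fetched, so the model sees a fresh version every epoch; the four images at 7:54 are just four displayed samples of the same photo. The number shown is controlled by max_n in show_batch. A sketch along the lines of the lesson's bear classifier (the path and names are illustrative):

from fastai.vision.all import *

bears = DataBlock(
    blocks=(ImageBlock, CategoryBlock),
    get_items=get_image_files,
    splitter=RandomSplitter(valid_pct=0.2, seed=42),
    get_y=parent_label,
    item_tfms=RandomResizedCrop(128, min_scale=0.3))
dls = bears.dataloaders(Path('bears'))   # illustrative path

# unique=True repeats one image; max_n sets how many versions to show.
dls.train.show_batch(max_n=6, nrows=2, unique=True)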

shaythuramelangkovan

As far as I know, Newton's Method involves the second-order derivative to find the root of a function, but what is shown here is purely gradient descent. Isn't it wrong to say "This idea is called Newton's Method" at 1:47:56?
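For comparison, the two update rules on f(x) = (x - 3)**2: gradient descent takes a fixed-size step along the slope, while Newton's Method also divides by the curvature (the second derivative):

def grad(x): return 2*(x - 3)   # f'(x)
def hess(x): return 2.0         # f''(x)

x, lr = 0.0, 0.1
print(x - lr*grad(x))           # gradient descent step: 0.6
print(x - grad(x)/hess(x))      # Newton step: 3.0, the exact minimum here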

utkumertt

The CC transcript has typos. How does one report or suggest edits there?

SrEngr

1:43:30 How does the function def pr_eight(x, w): return (x*w).sum() predict whether it's a 7 or an 8?
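On its own it doesn't: (x*w).sum() is just a linear score. The idea is that after training, w becomes large where the target digit's ink tends to be, so images of that digit get the highest scores. A sketch with stand-in values:

import torch

def pr_eight(x, w): return (x*w).sum()

x = torch.rand(784)     # a flattened 28x28 image (stand-in pixels)
w = torch.randn(784)    # stand-in weights; in the lesson these are learned
print(pr_eight(x, w))   # higher score = more like the target digit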

naveenperera

Why doesn't Rachel appear on screen when she intervenes?!? 😎

cag

Is there a way to remove the "um"s? Maybe there is a machine learning model that can do that.

dglebable

The vocal fry and upspeak are so painful to listen to.

CyberPsyLen