CNN Training with Code Example - Neural Network Programming Course

In this episode, we discuss the training process in general and show how to train a CNN with PyTorch.
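
For reference, here is a minimal sketch of the kind of training loop the episode builds. The Fashion-MNIST setup and the Network class are assumed from earlier episodes of the course:

import torch
import torch.nn.functional as F
import torch.optim as optim
import torchvision
import torchvision.transforms as transforms

train_set = torchvision.datasets.FashionMNIST(
    root='./data', train=True, download=True,
    transform=transforms.ToTensor()
)
train_loader = torch.utils.data.DataLoader(train_set, batch_size=100)

network = Network()                        # CNN built in earlier episodes (assumed)
optimizer = optim.Adam(network.parameters(), lr=0.01)

for batch in train_loader:                 # one epoch
    images, labels = batch                 # images: [100, 1, 28, 28]

    preds = network(images)                # forward pass
    loss = F.cross_entropy(preds, labels)  # calculate loss

    optimizer.zero_grad()                  # clear old gradients
    loss.backward()                        # calculate gradients
    optimizer.step()                       # update weights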

🕒🦎 VIDEO SECTIONS 🦎🕒

00:30 Help deeplizard add video timestamps - See example in the description
17:48 Collective Intelligence and the DEEPLIZARD HIVEMIND

💥🦎 DEEPLIZARD COMMUNITY RESOURCES 🦎💥

👋 Hey, we're Chris and Mandy, the creators of deeplizard!

👉 Check out the website for more learning material:

💻 ENROLL TO GET DOWNLOAD ACCESS TO CODE FILES

🧠 Support collective intelligence, join the deeplizard hivemind:

🧠 Use code DEEPLIZARD at checkout to receive 15% off your first Neurohacker order
👉 Use your receipt from Neurohacker to get a discount on deeplizard courses

👀 CHECK OUT OUR VLOG:

❤️🦎 Special thanks to the following polymaths of the deeplizard hivemind:
Tammy
Mano Prime
Ling Li

🚀 Boost collective intelligence by sharing this video on social media!

👀 Follow deeplizard:

🎓 Deep Learning with deeplizard:

🎓 Other Courses:

🛒 Check out products deeplizard recommends on Amazon:

🎵 deeplizard uses music by Kevin MacLeod

❤️ Please use the knowledge gained from deeplizard content for good, not evil.
Comments

Check out the corresponding blog and other resources for this video at:

deeplizard

The only channel I pause ad-block for.

CaptainBravo

You are a talented teacher, you explain things very, very well. I wish most ML channels were like that.

tallwaters

This is the best series on PyTorch! Everything is so detailed. Just amazing! 😊

tusharkalyani

There is only one thing I don't understand: WHY SO FEW VIEWS AND LIKES. This is just awesome work.
Really, thank you.

rounhi

Thank you, you have explained it very clearly. Good for beginners like me :)

anryxas

Hi there. As always, flawless! Please upload the next one! Can't wait to learn the next step.

MLDawn

I finally made the switch from Keras to PyTorch because, after watching this series, PyTorch seemed more intuitive to me!
Some time ago, after completing the theory course on deep learning, I also learned Keras from this channel.
Thanks for everything, man!
Waiting for the next one.

tarangranpara

Awesome explanation. Explaining the reason for each and every step is a beautiful way to teach.
Thank you so much for putting in the time and effort to create this beautiful tutorial.

shubhamshah

Man, you are such a great teacher. This is one of the best series I have ever seen on this and on RL with PyTorch. Thank you so much!!!

workhard

It couldn't be better than this. Please keep doing this. <3

saeedghorbani

There are many things I'm not sure of, like: is there a God? Or who is your favorite MK character, D'Vorah or Reptile? But these are definitely the best PyTorch tutorials on YouTube.

danielteixeira

What I learned:
1. What happens in one loop iteration.
2. Get a batch, propagate it through the network to get the predictions, calculate the loss between the predictions and the labels, and update the weights.
3. PyTorch does the heavy lifting: calculating the loss and updating the weights.

tingnews

Please do a video on embeddings in Keras! Love your videos so much!

wengeance

Thank you for the excellent video. The loss.backward() operation is amazing. I like it.

boxu

I am still very confused about how a whole batch is passed and how it's trained. For example, the shape of images will be [100, 1, 28, 28], indicating there are 100 images. However, how does the network deal with this batch? Does the class itself just handle it, or what? Are there going to be 100 calls on the network, with the weights then averaged?

prempant
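
On the batch question above: the network never loops over the 100 images, and nothing is averaged across separate calls. Every PyTorch layer operates on the leading batch dimension, so one forward call handles the whole [100, 1, 28, 28] tensor, and cross_entropy reduces the 100 per-sample losses to a single scalar (a mean by default). A minimal sketch with a hypothetical two-layer network:

import torch
import torch.nn as nn
import torch.nn.functional as F

net = nn.Sequential(
    nn.Conv2d(1, 6, kernel_size=5),   # [100, 1, 28, 28] -> [100, 6, 24, 24]
    nn.Flatten(),                     # -> [100, 6*24*24]
    nn.Linear(6 * 24 * 24, 10)        # -> [100, 10]
)

images = torch.randn(100, 1, 28, 28)  # one batch of 100 images
labels = torch.randint(0, 10, (100,)) # one label per image

preds = net(images)                   # ONE call processes all 100 images
print(preds.shape)                    # torch.Size([100, 10])

# cross_entropy averages the 100 per-sample losses into one scalar, so a
# single backward pass yields gradients averaged over the batch.
loss = F.cross_entropy(preds, labels)
print(loss.shape)                     # torch.Size([]) - a scalar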

Excellent series, the best framework tutorial I have seen on YouTube so far. Do you plan to create an episode about making a dataset & dataloader from a private dataset? Another question: is it possible to do extra pre-processing after loading a batch of images? For example, if the images are large, you may want to crop each one into smaller images before feeding them to the network. However, since the distributions within a single image (at different locations) differ, you need to crop after loading a batch of original images so that batch normalization still makes sense. Thank you

yunhuaji
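
On the private-dataset question above: a custom torch.utils.data.Dataset only needs __len__ and __getitem__, and any post-batch preprocessing can happen inside the training loop, after the DataLoader yields the batch. A minimal sketch, where PrivateImageDataset, the file names, and the quadrant cropping are all hypothetical stand-ins (the default collate function stacks per-item tensors, so images in a batch must share a size, or you supply a custom collate_fn):

import torch
from torch.utils.data import Dataset, DataLoader
from PIL import Image
import torchvision.transforms.functional as TF

class PrivateImageDataset(Dataset):
    def __init__(self, paths, labels):
        self.paths, self.labels = paths, labels

    def __len__(self):
        return len(self.paths)

    def __getitem__(self, idx):
        img = TF.to_tensor(Image.open(self.paths[idx]))  # load one large image
        return img, self.labels[idx]

paths = ['img_000.png', 'img_001.png']   # hypothetical private files
labels = [0, 1]
loader = DataLoader(PrivateImageDataset(paths, labels), batch_size=2)

for images, labels in loader:            # images: [2, C, H, W]
    # Extra preprocessing after the batch is loaded: split each large image
    # into four quadrants before feeding them to the network, so batch norm
    # statistics are computed over all crops from the loaded batch.
    h, w = images.shape[2] // 2, images.shape[3] // 2
    crops = torch.cat([images[:, :, :h, :w], images[:, :, :h, w:],
                       images[:, :, h:, :w], images[:, :, h:, w:]])  # [8, C, h, w]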

I have one question about the above lecture. It was explained in preceding lectures that cross_entropy() calls the softmax function:

preds = network(images)               # pass batch
loss = F.cross_entropy(preds, labels) # calculate loss

So the output preds would be a tensor of shape (100, 10) of probabilities, and by passing preds to cross_entropy we would get probabilities that sum to 1, since softmax operates on it. But the labels here are argmax(preds), so is the loss computed above correct, or am I missing something?

sumeetsawant
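
On the question above: the network's output is raw logits, not probabilities, because F.cross_entropy applies log_softmax internally before computing the negative log likelihood. The labels are the ground-truth class indices that the data loader returns with each batch; argmax(preds) is only used afterwards to count correct predictions, never to compute the loss. A minimal sketch with stand-in tensors:

import torch
import torch.nn.functional as F

preds = torch.randn(100, 10)            # raw logits from the network (stand-in)
labels = torch.randint(0, 10, (100,))   # ground-truth classes from the dataset

# cross_entropy == log_softmax followed by negative log likelihood:
loss_a = F.cross_entropy(preds, labels)
loss_b = F.nll_loss(F.log_softmax(preds, dim=1), labels)
print(torch.allclose(loss_a, loss_b))   # True

# argmax(preds) is only used to measure accuracy, not to compute the loss:
correct = preds.argmax(dim=1).eq(labels).sum()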

Hey, my question is how loss.backward() and optimizer.step() are related. I don't see how the values calculated by backward() and stored in loss can be passed to the optimizer (the optimizer can update weights but doesn't know anything about the loss). How exactly is this accomplished? Something more natural for me would be optimizer.step(loss). I also don't see how the network's gradients can be updated by loss.backward(), since the loss object was created without using the network object. Nevertheless, awesome job!!!

mikoajgrzywacz
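
On the question above: the link between loss.backward() and optimizer.step() is the set of parameter tensors they both touch. The loss tensor does trace back to the network, because preds = network(images) records an autograd graph reaching every parameter; backward() walks that graph and stores each parameter's gradient in its .grad attribute, and the optimizer, built from network.parameters(), reads those same .grad fields. A minimal sketch with a one-layer stand-in network:

import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim

net = nn.Linear(4, 2)                            # stand-in network
optimizer = optim.SGD(net.parameters(), lr=0.1)  # holds references to net's tensors

x = torch.randn(8, 4)
y = torch.randint(0, 2, (8,))

loss = F.cross_entropy(net(x), y)
print(net.weight.grad)        # None - no gradients yet

loss.backward()               # autograd walks from loss back to net.weight
print(net.weight.grad.shape)  # torch.Size([2, 4]) - gradients now stored

optimizer.step()              # reads net.weight.grad and updates net.weight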

Thank you very much for this series! It's helping me so much, you have no idea. I have one question, though: why do you call the loss.backward() method only once? After that, you calculated the preds tensor and the loss tensor again, but you didn't call the backward method. Why?

christiancruvinelfranca
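
On the question above: backward() is needed once per weight update. The second preds/loss computation in the episode is only there to show that the loss dropped after the update; no gradients are needed just to read the new loss value. Inside the loop, backward() runs once per batch, with zero_grad() clearing the accumulated gradients first. A minimal sketch with a stand-in network:

import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim

net = nn.Linear(4, 2)                       # stand-in for the course's CNN
optimizer = optim.SGD(net.parameters(), lr=0.1)
x, y = torch.randn(8, 4), torch.randint(0, 2, (8,))

loss = F.cross_entropy(net(x), y)
print('loss before:', loss.item())

optimizer.zero_grad()   # gradients accumulate, so clear them each iteration
loss.backward()         # one backward call computes all gradients for this step
optimizer.step()        # one update of the weights

# Second forward pass: just checking the new loss, so no backward() is needed.
loss = F.cross_entropy(net(x), y)
print('loss after:', loss.item())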