154 - Understanding the training and validation loss curves

Loss curves contain a lot of information about the training of an artificial neural network. This video walks through the interpretation of various loss curves generated using the Wisconsin breast cancer data set.

Code generated in the video can be downloaded from here:
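The curves discussed in the video can be reproduced with a minimal sketch like the one below. This uses scikit-learn rather than whatever framework the video's own code uses (an assumption), and records the training and validation log loss after every epoch by combining `warm_start=True` with `max_iter=1`:

```python
# Sketch: train/validation loss curves on the Wisconsin breast
# cancer data set (scikit-learn version; the video's code may differ).
import warnings
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import log_loss

warnings.filterwarnings("ignore")  # max_iter=1 raises ConvergenceWarning each epoch

X, y = load_breast_cancer(return_X_y=True)
X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.25, random_state=42)

# standardize features; fit the scaler on the training split only
scaler = StandardScaler().fit(X_train)
X_train, X_val = scaler.transform(X_train), scaler.transform(X_val)

# warm_start + max_iter=1 makes each fit() call run one more epoch
model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1,
                      warm_start=True, random_state=42)

train_losses, val_losses = [], []
for epoch in range(100):
    model.fit(X_train, y_train)
    train_losses.append(log_loss(y_train, model.predict_proba(X_train)))
    val_losses.append(log_loss(y_val, model.predict_proba(X_val)))
```

Plotting `train_losses` and `val_losses` against the epoch index gives the two curves; a validation curve that flattens or rises while the training curve keeps falling is the overfitting pattern the video discusses.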
Comments

This is outstanding. This is the first video to cover an actually useful process for developing a model from scratch in terms of architecture decisions. Does anyone else know of similar content?

dr_flunks

The explanation is so clear and deepened my understanding of the underfitting and overfitting phenomena. Thanks! It's really helpful!

satrioyudanto

This video is so good. The ideas were clearly explained and shown through the graphs. The examples cover a lot of cases you might encounter! I will definitely recommend it to others and rewatch this video if I am ever feeling confused. Thank you so much! You are a great teacher!

mk.

Excellent series that covers a lot of important concepts that many tutorials typically do not cover in great detail.

Messiah-

I benefited immensely from learning about the training and validation curves. Thank you!!

ashutoshshinde

Thank you very much for the amazing video. This is one of the most important topics that nobody talks about!!

farazshaikh

Thank you very much for the video; it is concise and covers a lot of cases for the learning curves. Exactly what I was looking for!

nazymsatbekova

Thanks for sharing how to interpret the loss curve. Very useful and concise.

jerkmeo

Wow!
That is an amazing and clear explanation with justifications. Thanks, and keep it up!!

kumala-win

This was so helpful! I was struggling to see the big picture here, and now I feel much more equipped. Thanks!

Shaans

I don't know how exactly to thank you. Simply amazing!!!

vaibhavvaibhav

Thank you for your wonderful clips. Please also teach about epochs and batch size. Thank you!

HosseinKianAra

Great! Very clear and concise, and practically all pitfalls are well explained! I would appreciate it if you talked about learning rate, batch size, and kernel size, and their impact on the training and validation loss curves. Many thanks!

ozodbekozodov

Thank you for this helpful video. You are the best teacher!

yizhenwu

Thank you so much sir. All of your videos really taught me a lot during my journey of completing my FYP!!!

limzisin

Perfectly explained... I wish I had a teacher like you.

gauthamsk

Thank you very much, sir, for the video. You have enlightened me on model evaluation for neural networks. Any advice on optimising the hyperparameters?

limotto

You mentioned a couple of times to keep changing the random_state in the train/test split and to choose the appropriate model based on its performance on each of these splits. But doesn't this mean you are leaking information from your test set into your training process? You would be choosing your model while looking at the test set, and it may not generalize to other unseen data. I haven't watched all your lectures, but presumably you advise including a separate validation set (train/validation/test) in the split. That would solve this problem, and as trivial as it might sound, it is a big problem in DL in my opinion.
Very nice and informative video, by the way. Thank you!

ahmadneishabouri

How nicely explained!
Thank you so much

soumyabrata

You're the best! Very complex scenarios explained in a simple way.

guillermoalejandrobastianf