Dropout Regularization | Deep Learning Tutorial 20 (Tensorflow2.0, Keras & Python)

Overfitting and underfitting are common phenomena in machine learning, and the techniques used to tackle overfitting are called regularization. In deep learning, dropout regularization randomly drops neurons from hidden layers during training, which helps the model generalize. In this video, we will look at the theory behind dropout regularization. We will then implement an artificial neural network for a binary classification problem and see how adding a dropout layer can improve the performance of the model.
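The random-drop mechanism the description refers to can be sketched in plain NumPy (a minimal illustration of inverted dropout, not the video's Keras code; the function name `dropout` and the 50% rate are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(42)

def dropout(activations, rate, training=True):
    """Inverted dropout: during training, zero out a fraction `rate` of
    units and scale the survivors by 1/(1-rate) so the expected
    activation is unchanged. At test time, pass values through as-is."""
    if not training or rate == 0.0:
        return activations
    keep_prob = 1.0 - rate
    mask = rng.random(activations.shape) < keep_prob  # True = unit survives
    return activations * mask / keep_prob

hidden = np.ones((4, 8))                               # stand-in hidden-layer activations
train_out = dropout(hidden, rate=0.5)                  # roughly half zeroed, survivors scaled to 2.0
test_out = dropout(hidden, rate=0.5, training=False)   # unchanged at inference
```

In Keras the same behavior comes from `tf.keras.layers.Dropout(0.5)`, which is likewise active only during training (`model.fit`) and a no-op at inference.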

#dropoutregularization #dropoutregularizationtechnique #dropoutregularisation #deeplearning #deeplearningtutorial #dropoutdeeplearning

Prerequisites for this series:   

DISCLAIMER: All opinions expressed in this video are my own and not those of my employer.
Comments

I'm at a loss for words to express my gratitude towards you. Your tutorial is amazing, thank you so much.

codinghighlightswithsadra

The sky has no limits, and your teaching leaves no questions. Those who dislike this are like people searching for the color of water, or running a car without petrol.

shaiksuleman

Such a funny analogy at the beginning! You are a true genius educator :D :D

geekyprogrammer

Sir, your way of teaching is awesome. Sir, please do videos on multi-class classification problems in deep learning.

clementvirgeniya

And here I was waiting for dropout regularization to happen, for you to delete dense layers #2 and #3!! Hahaha. Great stuff. Keep up the good work.

manideep

I am enjoying your tutorials. Thank you so much.

jongcheulkim

Thank you for this amazing tutorial! I even came to understand batch size, even though that wasn't my goal with this video! <3

achelias

Sir, your explanation is great, great, great.
But sir, please make the videos in this series faster, so that as our exams come near we can prepare well and finish in less time.
Thanks a lot for making such good videos.

pa

Sir, can you explain why dropping 50% of the neurons isn't the same as reducing the layer size by 50%? For example, instead of taking 60 neurons and dropping 50%, why don't we just take 30 neurons to begin with?
Thanks in advance

very_nice_

Thank you. Nicely explained with a clear-cut example.

sandiproy

Sir, please cover the concept of EARLY STOPPING... I know the implementation part but want to understand it in depth.

hardikvegad

Awesome, Sir. Thank you so much for making us understand such important concepts in a simple and easy way!!!

MrSHANKSHINE

Hi,
In deep learning, can you please post some videos on hyperparameter tuning?
Thanks

balajiplatinum

My accuracy came out almost the same; even so, I appreciate the video.

iaconst.

Great tutorial, love the biryani example 😂😂

fahadreda

9:10 Can we replace M and R with 0 and 1 instead of using dummy variables?

vishaltanawade

Thank you for your tutorial. I have learned a lot from it.

ncf

05:10 So effectively, dropout could be considered similar to a test/train split, in that it trains neurons A and C, then adjusts B and D based on the results from A and C.

devilzwishbone

Sir, my dataset has 20 target classes (i.e., a multi-class problem). When I train and test, my accuracy is only 45%. I am a little bit stuck with this. It would be helpful if you could give me some suggestions.

clementvirgeniya

Hi Sir, I have a question about the dropout technique. As we can see, this technique randomly deactivates neurons during training; what about testing, are neurons still deactivated then?

abdansyakura