Why Initialize a Neural Network with Random Weights || Quick Explained

What if you initialize the weights of a neural network with zeros?
What are the consequences, and why do we initialize the weights randomly?
In this video, I've explained why it's important to initialize the weights randomly (e.g., from a normal distribution) so that the network can break symmetry: if every weight starts at the same value, every neuron in a layer computes the same output and receives the same gradient, so the neurons never learn different features.

The video includes an explanation as well as experimental proof of what happens if the initial weights are zero.
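To make the symmetry argument concrete, here is a minimal NumPy sketch. It is not the code from the video; the task (XOR), network size, and variable names are my own illustration. It trains a tiny two-layer sigmoid network twice: once from all-zero weights, where every hidden neuron receives the same gradient and the hidden weight columns stay identical, and once from normally distributed weights, where the neurons diverge and learn different features.

import numpy as np

rng = np.random.default_rng(0)

# Toy XOR dataset
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def train(W1, b1, W2, b2, lr=0.5, steps=5000):
    """Plain gradient descent on a 2-layer sigmoid network with MSE loss."""
    for _ in range(steps):
        # forward pass
        a1 = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))   # hidden activations
        a2 = 1.0 / (1.0 + np.exp(-(a1 @ W2 + b2)))  # output
        # backward pass
        d2 = (a2 - y) * a2 * (1 - a2)
        d1 = (d2 @ W2.T) * a1 * (1 - a1)
        W2 -= lr * a1.T @ d2
        b2 -= lr * d2.sum(axis=0)
        W1 -= lr * X.T @ d1
        b1 -= lr * d1.sum(axis=0)
    return W1

hidden = 4

# 1) Zero initialization: every hidden neuron sees the same inputs, the same
#    weights, and therefore the same gradient, so the columns of W1 remain
#    identical forever -- symmetry is never broken and XOR cannot be fit.
W1_zero = train(np.zeros((2, hidden)), np.zeros(hidden),
                np.zeros((hidden, 1)), np.zeros(1))
print("zero init (all hidden columns identical):\n", W1_zero)

# 2) Random normal initialization: neurons start out different, receive
#    different gradients, and learn different features.
W1_rand = train(rng.normal(0, 0.5, (2, hidden)), np.zeros(hidden),
                rng.normal(0, 0.5, (hidden, 1)), np.zeros(1))
print("random init (columns differ):\n", W1_rand)

Printing the hidden-layer weight matrix after training shows the effect directly: under zero initialization all columns are equal, while under random initialization they differ.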

Stay tuned:

Thanks.
Comments

Please leave feedback if you can. It means a lot to me

DevelopersHutt

This channel is clearly underrated.. Kudos bro..

MsPOOJA

You are doing an amazing job. This way of first explaining the concept and then showing its implementation via proper code is superb!! Keep growing.

karangoyal

Great! Your single video was enough to break the symmetry.

vipingautam

Simple and Wonderful Explanation. Great Thanks!

eleanortay

Thanks a lot brother.. love you a lot.. keep going.. ❤️❤️❤️

mandarchincholkar

Make a playlist on building a neural network from scratch, without using any library.

neerajverma

Low curvature initialization is a good idea. Random initialization is very high curvature and it is very difficult to smooth that out.

hoaxuan

Thanks for your clear explanation.
Here I have a question about activation functions:
In case we are doing non-linear regression and dealing with a PDE involving 2nd- or 3rd-order derivatives, can we use a non-differentiable activation function such as ReLU, or should we necessarily use an infinitely differentiable one like Tanh?

AmirhosseinKhademi-ings

Off topic: There is a thing called Fast Transform fixed filter bank neural nets. I can't go into details because that leads to comment removal.

hoaxuan