How does Batch Normalization really work? [Lecture 6.3]

What is the idea behind batch normalization? How can batch normalization stabilize the training of deep neural networks?

Batch normalization is based on an idea similar to normalization in data preprocessing. Essentially, batch normalization is a layer that standardizes its inputs using the mean and variance computed over a mini-batch, followed by a learnable scale and shift. Including batch normalization requires a modification of the backpropagation algorithm, since gradients must also flow through the batch statistics. In practice, batch normalization acts as a regularizer and allows deep neural networks to be trained more efficiently.
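The standardization step described above can be sketched as follows. This is a minimal NumPy illustration of the forward pass only (the function name `batch_norm_forward` and the example values are mine, not from the lecture); a real layer would also track running statistics for inference and backpropagate through `mu` and `var`:

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    # Standardize each feature over the mini-batch (axis 0), then
    # rescale and shift with the learnable parameters gamma and beta.
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mu) / np.sqrt(var + eps)  # eps avoids division by zero
    return gamma * x_hat + beta

# A mini-batch of 4 samples with 3 features each
x = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [3.0, 6.0, 9.0],
              [4.0, 8.0, 12.0]])
gamma = np.ones(3)   # scale, learned during training
beta = np.zeros(3)   # shift, learned during training

out = batch_norm_forward(x, gamma, beta)
# With gamma=1 and beta=0, each feature column of `out` now has
# approximately zero mean and unit variance across the mini-batch.
```

With `gamma = 1` and `beta = 0` the layer outputs plain standardized activations; during training the network can learn other values of `gamma` and `beta`, so it is free to undo the normalization if that helps.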