How to Choose the Correct Initializer for Your Neural Network

Vanishing and exploding gradients are two of the main problems we face when building neural networks. Before trying out fixes, it is important to understand what these terms mean, why the problems happen, and what damage they do to training. In this video, we learn what it means for gradients to vanish or explode, and we take a quick look at the techniques available to deal with them, including choosing a suitable weight initializer.
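
As a rough sketch of the initializer idea discussed in the video (this code is not from the video itself; the layer sizes and input shape are placeholder assumptions), here is how an initializer can be chosen per layer in Keras to keep activation variance stable:

```python
import tensorflow as tf

# Toy network contrasting the two most common initialization schemes.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(32,)),  # placeholder input size
    # He initialization (weight variance ~ 2 / fan_in) pairs well with
    # ReLU, which zeroes out roughly half of its inputs on average.
    tf.keras.layers.Dense(64, activation="relu",
                          kernel_initializer="he_normal"),
    # Glorot/Xavier initialization (variance ~ 2 / (fan_in + fan_out))
    # is the classic choice for tanh or sigmoid activations.
    tf.keras.layers.Dense(64, activation="tanh",
                          kernel_initializer="glorot_uniform"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.summary()
```

The rule of thumb shown here is to match the initializer to the activation function, so that the signal's variance is neither shrunk (vanishing) nor amplified (exploding) as it passes through each layer.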

❓To get the most out of the course, don't forget to answer the end-of-module questions:

👉 You can find the answers here:

RESOURCES:

COURSES:

Comments

alreadyghosts: How are there still fewer than a thousand views? This series is one of the greatest things on YouTube.

bay-bicerdover: The Glorot variance formula you mention does not originate from Keras, but that is a minor detail.