Neural Networks From Scratch - Lec 9 - ReLU Activation Function

Building Neural Networks from scratch in python.
This is the ninth video of the course "Neural Networks From Scratch". It covers the ReLU activation function and its importance: we discuss the advantages and drawbacks of the ReLU activation function, go over some tips for using ReLU in neural networks, and walk through a Python implementation.
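As a companion to the video, here is a minimal NumPy sketch of ReLU and its derivative (the implementation shown in the video may differ in details; the zero-at-zero derivative convention is an assumption):

```python
import numpy as np

def relu(x):
    # ReLU passes positive inputs through unchanged and clamps
    # negative inputs to zero: max(0, x) element-wise.
    return np.maximum(0.0, x)

def relu_derivative(x):
    # Gradient is 1 where x > 0 and 0 elsewhere.
    # (At exactly x = 0 the derivative is undefined; 0 is used by convention.)
    return (x > 0).astype(float)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))             # negative entries become 0, positives pass through
print(relu_derivative(x))  # 0 for non-positive entries, 1 for positive ones
```

Because the derivative is exactly 1 on the positive side, gradients flowing through active ReLU units are not shrunk, which is why ReLU helps with the vanishing gradient problem discussed in the video.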

Course Playlist:

Sparsity in Machine Learning:

Please like and subscribe to the channel for more videos. This will help me in assessing the interests and creating more content. Thank you!

Chapters:
0:00 Introduction
0:38 Vanishing Gradient Problem
1:11 ReLU Activation function
2:07 ReLU function behaviour
2:55 Sine wave approximation using ReLU
3:20 Derivative of ReLU
3:46 Dying ReLU Problem
4:29 Advantages
6:33 Drawbacks
7:37 Tips for using ReLU
9:05 Python Implementation
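The Dying ReLU problem (3:46) arises when a unit's input stays negative, so its gradient is always zero and it stops learning. One common mitigation mentioned among the usual tips is Leaky ReLU; a minimal sketch (the slope value 0.01 is a conventional default, not necessarily the one used in the video):

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # A small slope alpha for negative inputs keeps the gradient
    # nonzero there, so the neuron can still recover ("un-die").
    return np.where(x > 0, x, alpha * x)

def leaky_relu_derivative(x, alpha=0.01):
    # 1 for positive inputs, alpha (instead of 0) for the rest.
    return np.where(x > 0, 1.0, alpha)

x = np.array([-3.0, -1.0, 2.0])
print(leaky_relu(x))             # negative entries are scaled by alpha, not zeroed
print(leaky_relu_derivative(x))  # gradient never becomes exactly zero
```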

#reluactivationfunction #reluactivationfunctioninneuralnetwork #sigmoidactivationfunction #activationfunctioninneuralnetwork #vanishinggradient
Comments:

I wonder why there is not even a single comment for this video. This is by far one of the simplest and yet clearest explanations of ReLU activation I've come across on the internet. I've been following your lectures from the first one. Amazing work. Keep it up. You'll soon get the appreciation you truly deserve.

SwiftMind

I saw this video as a suggestion for MIT's MicroMasters track in Statistics and Data Science. Thanks for making this.

et