Neural Networks From Scratch - Lec 16 - Summary of all Activation functions in 10 mins

Building Neural Networks from scratch in Python.
This is the sixteenth video of the course "Neural Networks From Scratch". This video summarizes all the activation functions: we cover the motivation, definition, properties, and performance comparison for each function.

Neural Networks From Scratch Playlist:

Activation Functions Playlist:

Github Repo:

Step Activation:
Sigmoid Activation:
Tanh Activation:
Softmax Activation:
ReLU Activation:
Variants of ReLU:
Maxout Activation:
Softplus Activation:
Swish Activation:
Mish Activation:
GeLU Activation:
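As a quick reference alongside the links above, here is a sketch of the standard formulas for the activation functions covered in the video, written with NumPy. These are textbook definitions (the GELU uses the common tanh approximation); the video's own implementations may differ in detail.

```python
import numpy as np

def step(x):
    # Heaviside step: 1 for x >= 0, else 0
    return np.where(x >= 0, 1.0, 0.0)

def sigmoid(x):
    # 1 / (1 + e^-x), squashes to (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # zero-centered squashing to (-1, 1)
    return np.tanh(x)

def relu(x):
    # max(0, x)
    return np.maximum(0.0, x)

def softplus(x):
    # log(1 + e^x), a smooth approximation of ReLU
    return np.log1p(np.exp(x))

def swish(x):
    # self-gated: x * sigmoid(x)
    return x * sigmoid(x)

def mish(x):
    # x * tanh(softplus(x))
    return x * np.tanh(softplus(x))

def gelu(x):
    # tanh approximation of x * Phi(x)
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

def softmax(x):
    # normalizes a vector into a probability distribution;
    # subtracting the max keeps the exponentials numerically stable
    e = np.exp(x - np.max(x))
    return e / e.sum()
```

For example, `softmax(np.array([1.0, 2.0, 3.0]))` returns probabilities that sum to 1, and `swish`, `mish`, and `gelu` all pass through the origin like ReLU but are smooth there.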

Please like and subscribe to the channel for more videos. This will help me assess your interests and create more content. Thank you!

Chapters:
0:00 Introduction
0:50 Step Activation
1:54 Sigmoid Activation
3:27 Tanh Activation
4:12 ReLU Activation
5:58 Softplus Activation
6:54 Maxout Activation
8:16 GeLU Activation
9:11 Swish Activation
10:07 Mish Activation
11:07 Softmax Activation

#stepactivationfunction, #sigmoidactivationfunction, #reluactivationfunction, #softmaxactivationfunction, #tanhactivationfunction, #geluactivationfunction, #swishactivationfunction, #mishactivationfunction, #softplusactivationfunction, #activationfunctioninneuralnetwork, #vanishinggradient, #selfgatedactivationfunction, #dropout
Comments

Is there somewhere I can access the slides in this video? It would make for a great quick reference.

alexanderfortman