L-2 Activation Functions in Deep Learning

Learn about different activation functions, sketched in code below:
1- Unit Step or Binary Activation Function
2- Sigmoid or Logistic Activation Function
3- ReLU Activation Function
4- TanH Activation Function
5- Softmax Activation Function
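
The five functions above map to simple formulas. As a minimal NumPy sketch (my own illustration, not code from the video; the function names and the NumPy dependency are assumptions):

import numpy as np

def unit_step(x):
    # Unit/binary step: 0 for x < 0, 1 for x >= 0
    return np.where(x >= 0, 1.0, 0.0)

def sigmoid(x):
    # Logistic function: squashes any real input into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Rectified Linear Unit: passes positives through, zeroes out negatives
    return np.maximum(0.0, x)

def tanh(x):
    # Hyperbolic tangent: squashes input into (-1, 1)
    return np.tanh(x)

def softmax(x):
    # Turns a vector of scores into a probability distribution
    # (subtract the max first for numerical stability)
    e = np.exp(x - np.max(x))
    return e / e.sum()

print(softmax(np.array([1.0, 2.0, 3.0])))  # approx [0.0900 0.2447 0.6652], sums to 1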

Comments

Thank you for providing such information, but I have one note: on slide no. 3, the threshold conditions on x in both equations are written as x > 0 and 0 <= x. Can you explain, please?

JwanKAlwan

In the unit / binary step activation slide there is an error: the threshold is written as 0 > x and x <= 0.
The correct threshold is 0 > x and x >= 0.

ghantaharshith
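
For reference (my own note, not from the slides): the standard unit step definition agrees with this correction,

f(x) = 0 for x < 0 (i.e. 0 > x)
f(x) = 1 for x >= 0

so, for example, f(-3) = 0, f(0) = 1, and f(2) = 1.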

Since sigmoid gives an analogue output between 0 and 1, is this not equivalent to normalising the data, or to taking a percentage of the input?

devilzwishbone
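
A brief aside (my own note, not from the video): sigmoid is not the same as normalising the data. It pushes each value independently through the fixed curve 1 / (1 + e^(-x)), so sigmoid(0) = 1 / (1 + e^0) = 0.5 no matter what the rest of the input looks like, whereas a normalised value (e.g. under min-max scaling) depends on the minimum and maximum of the whole dataset.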

Ma'am, you said that the sigmoid function has values between 0 and 1, but for sigmoid(-10) it had a value greater than 1. Why?

MATHSADDAU
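
For context (my own check, not from the video): sigmoid(x) = 1 / (1 + e^(-x)) is strictly between 0 and 1 for every real x, and for large negative inputs it approaches 0, not 1. A quick sketch, assuming NumPy:

import numpy as np

def sigmoid(x):
    # Always in the open interval (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

print(sigmoid(-10))  # approx 4.54e-05: tiny, but still between 0 and 1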

Ma'am, I have watched so many deep learning courses, but your teaching is the most amazing of all of them.

engineerweeb

In slide 4, the function table is incorrect for negative values. Overall, a very good teaching video.

dipankarbarman

Your teaching is so good! Tutorials from other channels are not as clear as yours! Thank you 🙏🏻🙏🏻🙏🏻

exoticcoder

Ma'am, can you please share the slides?

hanae.health

Why are Leaky ReLU and ELU not part of this video?

anuragshrivastava
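
For completeness (my own sketch; these variants are not covered in the video): Leaky ReLU and ELU both keep a small, nonzero response for negative inputs, which helps avoid "dead" ReLU units. The alpha values below are common defaults, not values from the video:

import numpy as np

def leaky_relu(x, alpha=0.01):
    # Like ReLU, but negative inputs are scaled by a small slope alpha
    return np.where(x > 0, x, alpha * x)

def elu(x, alpha=1.0):
    # Exponential Linear Unit: smoothly saturates toward -alpha for negatives
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

print(leaky_relu(np.array([-2.0, 3.0])))  # [-0.02  3.  ]
print(elu(np.array([-2.0, 3.0])))         # approx [-0.8647  3.    ]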