What is an Activation Function in a Neural Network? Types of Activation Functions in Neural Networks

In this video, we will see what an activation function is in a neural network, the types of activation functions, why we need an activation function, and which activation function to use.

The most commonly used activation functions in neural networks are:

1.) Sigmoid activation function
2.) Tanh activation function
3.) ReLU activation function
4.) Softmax activation function

All of these activation functions have advantages and disadvantages of their own.

We use the sigmoid function at the output neuron for binary classification, while we use the softmax activation function at the output layer for multi-class classification.

For hidden layers, the sigmoid function is not a good choice, so we use the tanh or ReLU activation function there instead.
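As a quick reference, here is a minimal NumPy sketch of the four activation functions covered in the video; the code and the tiny example values are my own illustration, not taken from the video.

```python
import numpy as np

def sigmoid(x):
    # Squashes inputs to (0, 1); typically used at the output neuron for binary classification.
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Squashes inputs to (-1, 1); zero-centered, so usually preferred over sigmoid in hidden layers.
    return np.tanh(x)

def relu(x):
    # max(0, x); the common default for hidden layers.
    return np.maximum(0.0, x)

def softmax(x):
    # Turns a vector of scores into probabilities that sum to 1;
    # used at the output layer for multi-class classification.
    e = np.exp(x - np.max(x))  # subtract the max for numerical stability
    return e / e.sum()

z = np.array([-2.0, 0.0, 3.0])
print(sigmoid(z))   # approx. [0.119 0.5   0.953]
print(tanh(z))      # approx. [-0.964 0.     0.995]
print(relu(z))      # [0. 0. 3.]
print(softmax(z))   # approx. [0.006 0.047 0.946] -- sums to 1
```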

Timestamps:
0:00 - Video Agenda
0:30 - Why do we need activation function
2:08 - Sigmoid Function
3:12 - Tanh Activation Function
5:15 - ReLU Activation Function
7:01 - Softmax Activation Function
9:15 - When not to use any Activation Function
9:53 - Summary

➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖

This is Your Lane to Machine Learning ⭐

➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖


Comments

If you found this video helpful, then hit the *_like_* button👍, and don't forget to *_subscribe_* ▶ to my channel as I upload a new Machine Learning Tutorial every week.

CodingLane

You explain these concepts more completely and simply than any other video I’ve seen. Thank you

masplacasmaschicas

Just started this playlist and found it very well explained. Thank you for the great work.

tiyyob

Thanks a lot for this amazing introductory lecture 😀
Lecture 3 completed from this Neural Network playlist.

PrithaMajumder

damn, this is lowkey a really good and insightful way of explaining this. I'll be sharing with my students. Exceptional tutorial

GX

A clear and brief explanation of the activation functions, Sir Patel. Wonderful. I am acquiring new knowledge from every one of your videos. Great going!

brindhasenthilkumar

You are saving me rn from my midterm tomorrow. Thank you!!!

rutvipatel

Very underrated channel. Great explanation!

petcumircea

very very very useful for me. Thank you

maheshyezarla

Thanks a lot for sharing! It really helped me understand why, when, and which activation function to use. Very good!

chillax

First of all, thank you very much for these videos. I have a question about cross entropy. I understand how cross entropy works, but I don't understand why it works. I would appreciate it if you could make videos on these topics.

ulmwfue

Perfect explanation. Thank you. Keep going!

dgmnzex

very well explained, thanks so much for the video

teozhisen

Will this explanation be enough for a beginner in ML? I understood what you have explained. I am learning from you. Thank you.

aienImchen-hsfp

Amazing explanation, just one mistake at 10:16 to 10:24: it should be "Sigmoid and TanH", not "ReLU and TanH"...

priyanshupatelhawk

How does ReLU solve the vanishing gradient problem, since part of its gradient is zero for x < 0?

algorithmo

@10:18 wouldn't it be "both the tanh and sigmoid function (and not 'ReLU') had this disadvantage of the vanishing gradient prob..."? ReLU is the solution to it, right?

Ivaan_reminiscence

Does ReLU make f(x) = 0 even if x is very small but > 0? Because with tanh/sigmoid the gradient becomes very small but stays > 0, whereas with ReLU, f(x) seems to be 0 only when x <= 0.
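A quick numerical sketch of this point (my own illustration, not from the video): the derivative of ReLU is exactly 1 for any x > 0, however small, and exactly 0 for x <= 0, while the sigmoid and tanh derivatives stay positive everywhere but shrink toward 0 as |x| grows, which is what causes the vanishing gradient.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def grad_sigmoid(x):
    s = sigmoid(x)
    return s * (1.0 - s)          # peaks at 0.25, shrinks toward 0 for large |x|

def grad_tanh(x):
    return 1.0 - np.tanh(x) ** 2  # peaks at 1, shrinks toward 0 for large |x|

def grad_relu(x):
    return (x > 0).astype(float)  # exactly 1 for x > 0, exactly 0 for x <= 0

x = np.array([-5.0, -0.1, 0.1, 5.0])
print(grad_sigmoid(x))  # approx. [0.0066 0.2494 0.2494 0.0066]
print(grad_tanh(x))     # approx. [0.0002 0.9901 0.9901 0.0002]
print(grad_relu(x))     # [0. 0. 1. 1.]
```

So ReLU passes the gradient through at full strength for any positive input and only zeroes it for x <= 0, whereas the sigmoid and tanh gradients become tiny for large |x| even though they never reach exactly 0.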

Ivaan_reminiscence

Is it possible for you to add/share further reading documents?

ArchitStark

Which one is a non-symmetric activation function?

susw