Activation Functions | Deep Learning Tutorial 8 (Tensorflow Tutorial, Keras & Python)

Activation functions (step, sigmoid, tanh, ReLU, leaky ReLU) are very important in building a non-linear model for a given problem. In this video we will cover different activation functions that are used while building a neural network. We will discuss these functions with their pros and cons:
1) Step
2) Sigmoid
3) tanh
4) ReLU (rectified linear unit)
5) Leaky ReLU
We will also write Python code to implement these functions and see how they behave for sample inputs.
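
As a minimal sketch of what those implementations might look like (NumPy-based; the exact code written in the video may differ), here are the five functions applied to a few sample inputs:

import numpy as np

def step(x):
    # 1 for inputs >= 0, otherwise 0
    return np.where(x >= 0, 1, 0)

def sigmoid(x):
    # squashes any input into the range (0, 1)
    return 1 / (1 + np.exp(-x))

def tanh(x):
    # squashes any input into the range (-1, 1)
    return np.tanh(x)

def relu(x):
    # keeps positive values, zeroes out negative values
    return np.maximum(0, x)

def leaky_relu(x, alpha=0.1):
    # like ReLU, but negative values are scaled by a small slope (0.1 here, matching the video)
    return np.where(x > 0, x, alpha * x)

sample_inputs = np.array([-10.0, -1.0, 0.0, 1.0, 10.0])
for fn in (step, sigmoid, tanh, relu, leaky_relu):
    print(fn.__name__, fn(sample_inputs))

Running this prints one row per function, which makes it easy to compare how each one treats negative, zero, and large inputs.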

🔖 Hashtags 🔖
#activationfunction #activationfunctionneuralnetwork #neuralnetwork #deeplearning

Prerequisites for this series:   

#️⃣ Social Media #️⃣
Comments

The number of views does not do justice to the quality of the content being created. Learning two weeks' worth of content in hardly 20 minutes. Thanks

arslanMCL

Very well articulated. I searched the whole web and nobody explained these concepts in such a simple way, without any confusion!!! Thank you

abvijaykumar

Very structured and organic build-up of concepts, not throwing a bunch at you in a short timeframe and praying you gobble it up. I appreciate your hard work behind the animations too. Keep it up!

ansumansatpathy

Hats off. I am a PhD student who has worked on NLP, ML, and text analytics, and in the last semester of my PhD I am turning to deep learning for my postdoc research, so I needed background information on deep learning. In my last project I somehow managed to apply a simple deep learning classifier, but that instinct to theoretically and technically understand the background of deep learning was missing. I have read articles and watched videos a lot, but man, your videos on deep learning concepts are really fulfilling that instinct. Hats off to you, bro. Thank you for your vision of education and these helpful tutorials.

JunaidInHenan

I do get valuable information from YouTube now and then. However, I did not expect deep learning tutorials to be explained so simply and yet be so informative. The Machine Learning and Deep Learning videos on this channel are highly recommended.
Thank you for such content.

MrBasu-iqmd

Sir, what an explanation! It seems so easy to learn deep learning. Carry on your winning momentum; hope you become one of the great teachers in data science🔥🔥🔥🔥🔥🔥🔥🔥🔥

karthikc

In university, I read about these functions with zero knowledge of their practical implementation. Now I have a clear concept. Thanks codebasics❤❤

lifefact

I have been watching your videos since the beginning. All are amazing, Sirji.

SKumar-Munna

Sir, you explain everything very nicely and in a simple way. Thanks

PriyaCSEAI

You are awesome. Complex topics explained so clearly that they just stick in the brain. These lectures are of the highest quality. Thank you for sharing your knowledge, and for free!

shivam

I really loved the easy explanation given by you, sir. I wish I'd found this series earlier, but I will watch the series from now on. Thank you for your efforts.

shreyakapadia

00:00 Activation functions are necessary in neural networks
02:04 Activation functions are necessary for building non-linear equations in neural networks.
04:06 Step function and sigmoid function are activation functions used in classification
05:57 Use the sigmoid function in the output layer and the tanh function in all other places.
07:54 Derivatives and the problem of vanishing gradients
10:02 The most popular activation function for hidden layers is the ReLU function.
12:04 Sigmoid and tanh functions are used to convert values into a range of 0 to 1 or -1 to 1 respectively.
14:30 In ReLU, positive values remain the same and negative values become zero; the leaky ReLU function multiplies negative inputs by 0.1

soubhikghoshroy
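
As a side note on the vanishing-gradient chapter (07:54) listed above, here is a small illustration, not taken from the video, of why sigmoid causes the problem: its derivative never exceeds 0.25 and shrinks quickly for large |x|, so multiplying such factors across many layers drives the gradient toward zero.

import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def sigmoid_derivative(x):
    # derivative of sigmoid is s * (1 - s), which peaks at 0.25 when x = 0
    s = sigmoid(x)
    return s * (1 - s)

for x in [0.0, 2.0, 5.0, 10.0]:
    print(f"x = {x:5.1f}   sigmoid'(x) = {sigmoid_derivative(x):.6f}")

# chaining small derivatives through many layers drives the gradient toward zero
print("product of 10 such derivatives at x = 0:", sigmoid_derivative(0.0) ** 10)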

There is no better explanation I've come across when it comes to data science, machine learning, or deep learning. It's a shame that big e-learning companies like Edureka are just copying content, as mentioned by Dhaval sir in one of his recent videos.

goodwork

I am getting more attracted towards deep learning by your explanation. What an explanation! Great, sir, hats off.

umarnadaf

Wow, this channel has a lot of crucial content. ReLU activation decreased my loss value from 0.04 to 0.003 even with half of the training data!

sayochikun

Thank you very much! Such a great explanation. Thank you for explaining the pitfalls in the activation functions; it is the first time I have heard about them.

nikolaynedelchev

Thank you once again for making Machine Learning simple...God bless.

ChristopherUOnova

Hi sir, amazing! I watched many videos on YouTube regarding deep learning and data science, but I failed to find this type of help, lectures, and mentoring. Hats off. A bundle of thanks and prayers for you from Rizwan (Pakistan). Keep it up.

muhammadrizwan

Sir, you give a good concept of deep learning. I am a beginner and one of my friends referred me to your deep learning lectures; when I started your lectures I learned so much from them. Sir, keep it up for the future. Thank you again, sir.

mytechwork

Your videos are excellent. Your words and diagrams really help clarify the process. I have recommended your videos to fellow colleagues. Bravo 👍

jas