Softmax Function in Deep Learning

In mathematics, the softmax function, also known as softargmax or normalized exponential function, is a function that takes as input a vector z of K real numbers, and normalizes it into a probability distribution consisting of K probabilities proportional to the exponentials of the input numbers.
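The definition above can be sketched in a few lines of NumPy. This is an illustrative implementation, not code from the video; the trick of subtracting the maximum before exponentiating is a standard numerical-stability measure that leaves the result unchanged.

```python
import numpy as np

def softmax(z):
    """Map a vector z of K real numbers to K probabilities
    proportional to exp(z). Subtracting max(z) first avoids
    overflow in exp() without changing the output."""
    shifted = z - np.max(z)
    exps = np.exp(shifted)
    return exps / np.sum(exps)

z = np.array([1.0, 2.0, 3.0])
probs = softmax(z)
print(probs)        # larger inputs receive larger probabilities
print(probs.sum())  # the outputs sum to 1
```

Because every output is strictly positive and the outputs sum to 1, the result can be read directly as a probability distribution over the K classes.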

If you have any questions about what we covered in this video, feel free to ask in the comment section below and I'll do my best to answer them.

If you enjoy these tutorials and would like to support them, the easiest way is to like the video and give it a thumbs up. It is also a huge help to share these videos with anyone you think would find them useful.

Please consider clicking the SUBSCRIBE button to be notified of future videos, and thank you all for watching.

You can find me on:

#Softmax #DeepLearning
Comments

The simplest explanation of the softmax function I have seen so far. Thank you very much, Sir.

zeeshanahmed

Perfect! I recommend this channel to my classmates! Thanks

mohammadalibalajlamresearc

God Bless my friend. Thank you for this simple tutorial.

nevin

Very precise and to-the-point tutorial, thanks for the explanation :)

shivamsinghal

Thank you so much. This was very well explained!

alaishabarber

May Almighty Allah reward you for helping students. Thanks, dear sir.

mohammadbilalniazi

Very sweet, simple, and to-the-point explanation. Normally I thumbs-down a video if I don't like it, but you are amazing, champ. Great delivery. Keep up the good work.

anandgoel

Simple and Clear Explanation!! Thank You :)

vatsal_gamit

Thanks for the video... these simple concepts help a lot in visualising a much bigger picture.

utkarshprakash

What happens under the hood when we use a Keras Dense output layer with the softmax activation?

kurianbenoy

Cool! One question: why not just pick the element of the array with the greatest magnitude?

AnsonSavage

Nicely described, Bhavesh. I am learning Deep Learning on Coursera and was going through Logistic Regression, the Sigmoid Gradient, and Normalization. One of the steps was the Softmax Function, and I was getting confused there. This helped me overcome that. Great work, truly. Do you have a playlist of Deep Learning videos that I could refer to?

kurrysamir