The Softmax : Data Science Basics

All about the SOFTMAX function in machine learning!
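For reference, a minimal sketch of the function the video builds up to, assuming NumPy (the function name and the example scores are just for illustration):

    import numpy as np

    def softmax(scores):
        # Subtract the max before exponentiating for numerical stability;
        # softmax is unchanged when the same constant is added to every score.
        exps = np.exp(scores - np.max(scores))
        return exps / exps.sum()

    print(softmax(np.array([2.0, 1.0, -1.0])))  # ≈ [0.705, 0.259, 0.035], sums to 1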
Comments

I really love how you progress step by step instead of directly throwing out the formulas! The best video on YouTube on the Softmax! +1

wennie

Tutorials with boards now... nice one, dude... underrated channel, I must say!

birajkoirala

For a non-mathematician like myself, this was crystal clear, thanks very much!

DFCinBE

This is excellent! I saw your video on the sigmoid function and both of these explain the why behind their usage.

debapriyabanerjee

Thank you!!! This is so much clearer and more to the point than two 20-minute videos on Softmax from "Machine Learning with Python-From Linear Models to Deep Learning" from MIT! To be fair, the latter covers multiple perspectives and is good in its own right. But you deliver just the most important first bit: what softmax is and what all these terms are about.

ekaterinakorneeva

What a great explanation! Thank you very much.

The explanation of why we choose this formula versus that formula truly makes everything clear. Thank you once again :)

iraklisalia

Awesome stuff. Searched for this video because I was trying to figure out why the scores/sum-of-scores approach wouldn't work, and you addressed it first thing. Great job.

marcusakiti
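(A quick illustration of the point above, assuming NumPy with made-up scores: dividing raw scores by their sum breaks down when scores can be negative, while exponentiating first keeps every weight positive.)

    import numpy as np

    scores = np.array([2.0, -1.0])

    naive = scores / scores.sum()                  # [ 2., -1.] -- not a valid distribution
    soft  = np.exp(scores) / np.exp(scores).sum()  # ≈ [0.95, 0.05] -- positive, sums to 1

    print(naive, soft)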

The person who is going to be responsible for kick-starting my ML journey with a good head on my shoulders. Thank you, ritvik, very enlightening.

omniscienceisdead

The introduction to softmax, which explains why softmax exists, helped me a lot in understanding it.

MOREClay

The only video I needed to understand the SOFTMAX function. Kudos to you!!

ManpreetKaur-vegw

Great explanations; adding a story to the objects really helps in understanding the material.

zvithaler

Thanks. Very clear explanation of the rationale for employing exponential functions instead of linear functions.

okeuwechue
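(One way to see that rationale, sketched with NumPy and made-up scores: the exponential keeps every weight positive, preserves the ranking of the scores, and makes the result invariant to adding the same constant to every score.)

    import numpy as np

    def softmax(s):
        e = np.exp(s - s.max())  # shifting by the max does not change the output
        return e / e.sum()

    s = np.array([1.0, 2.0, 3.0])
    print(softmax(s))        # ≈ [0.090, 0.245, 0.665] -- ordering of scores preserved
    print(softmax(s + 100))  # identical output: a constant shift changes nothing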

More great content, thank you so much!

maralazizi

An excellent and straightforward way of explaining. So helpful! Thanks a lot :)

karimamakhlouf

Thank you so much. I now understand why the exponential is used instead of a simple calculation. 😊

MTech-DataScience

Wow...teaching from first principles...I love that!

somteezle

I like the hierarchy implied by the indices on the S vector ;)

michael

Woooow, really liked your teaching approach, awesome!

zafarnasim

Please note that the outputs of Softmax are NOT probabilities but are interpreted as probabilities. This is an important distinction! The same goes for the Sigmoid function. Thanks

MLDawn
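(A small sketch of the distinction above, assuming NumPy and made-up scores: the outputs always form a valid distribution, but rescaling the scores changes the "confidence" arbitrarily, which is one reason to treat them as interpreted, not calibrated, probabilities.)

    import numpy as np

    def softmax(s):
        e = np.exp(s - s.max())
        return e / e.sum()

    s = np.array([2.0, 1.0, 0.0])
    print(softmax(s))       # ≈ [0.665, 0.245, 0.090] -- sums to 1
    print(softmax(10 * s))  # ≈ [1.000, 0.000, 0.000] -- same ranking, very different "probabilities"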

Now I know why a lot of your videos answer the WHY question. You give importance to application, not the theory alone. The concept is very clear. Thanks.

vamshi