Neural Networks Part 5: ArgMax and SoftMax

When your Neural Network has more than one output, it is very common to train with SoftMax and, once trained, swap SoftMax out for ArgMax. This video gives you all the details on these two methods so that you'll know when and why to use ArgMax or SoftMax.
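
For anyone who wants to see the difference in code, here is a minimal NumPy sketch (not taken from the video; the raw output values are made up) of how SoftMax and ArgMax treat the same raw outputs:

import numpy as np

# Hypothetical raw output values from a network with three outputs
raw_output = np.array([1.43, -0.40, 0.23])

def softmax(x):
    # SoftMax is differentiable, so it can be used during training.
    # It returns values between 0 and 1 that add up to 1.
    e = np.exp(x - np.max(x))  # subtract the max for numerical stability
    return e / e.sum()

print(softmax(raw_output))    # approximately [0.68 0.11 0.21]

# ArgMax just picks the index of the largest raw output. It is not useful
# for gradient-based training, but it is handy for classifying new
# observations once the network is trained.
print(np.argmax(raw_output))  # 0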

NOTE: This StatQuest assumes that you already understand:

For a complete index of all the StatQuest videos, check out:

If you'd like to support StatQuest, please consider...

Buying my book, The StatQuest Illustrated Guide to Machine Learning:

...or...

...a cool StatQuest t-shirt or sweatshirt:

...buying one or two of my songs (or go large and get a whole album!)

...or just donating to StatQuest!

Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on twitter:

0:00 Awesome song and introduction
2:02 ArgMax
4:21 SoftMax
6:36 SoftMax properties
9:31 SoftMax general equation
10:20 SoftMax derivatives
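
For reference, the general SoftMax equation covered at 9:31 has the standard textbook form, and the derivatives covered at 10:20 follow from it (this is standard notation, not a transcript of the video's own labels):

\[
\text{SoftMax}_i(\mathbf{z}) = \frac{e^{z_i}}{\sum_{j=1}^{K} e^{z_j}},
\qquad
\frac{\partial\,\text{SoftMax}_i(\mathbf{z})}{\partial z_j} = \text{SoftMax}_i(\mathbf{z})\,\big(\delta_{ij} - \text{SoftMax}_j(\mathbf{z})\big),
\]

where K is the number of outputs and \delta_{ij} equals 1 when i = j and 0 otherwise.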

#StatQuest #NeuralNetworks #ArgMax #SoftMax
Comments

Can't wait for "cross entropy clearly explained!" BAM!

mrglootie

Universities offering AI/ML programs should just hire a program manager to sort and prioritize Josh Starmer's YT videos and organize exams.

AndruXa

The video is so impressive, especially when you explain why we can't treat the output of softmax as a simple probability. Best tutorial ever for all the explanations in ML!!!

cara

Your videos are awesome! Thank you for doing them and continue with the great work! 👍

AlbertHerrandoMoraira

Thank you! This is by far the clearest explanation of SoftMax I've found. I finally get it!

bryanaero

I just want to say that YOU are awesome. Best educational content on the web hands down.

pulse

Your way of explaining things made me subscribe to you. Love to see topics explained in a simple yet funny way. Keep up the great work. And also.... *BAM*

karansaxena

No words for you, man, you are doing a great job, and I totally fell in love with your music and the way you teach. Love from India ❤️

Aman-ukfw

Nice touch at the end. I didn't realise the use for ArgMax until you said it's nice for classifying new observations.

iReaperYo

Sir, the way you teach is exceptionally creative.
Thanks to you, my deep learning exam went well.

factsfigures

Hi Josh,
Your explanations are super awesome!!! You break down the barriers to statistics!!! They are also super creative :). Many thanks! Please keep it up. Thanks again. BAM!!!

ishanbuddhika

Thanks, Josh, for the crystal-clear explanation.

aswink

This is all so well explained! Just wow!

lucarauchenberger

Hey Josh, needless to say, your videos and tutorials are amazingly fun! Can you please create a video series on Shapley values? They are widely used in practice.

menchenkenner

Your videos have been extremely helpful, thank you so much!!

NicholasHeeralal

Excellent video. Thank you for explaining it so well.

haadialiaqat

You deserve a professor title!!! Fantastic!

drccccccccc

Just bought your book! It's AMAZING!!! Your videos too :)

coralkuta

Hello Josh,
thank you very much for this beautiful explanation.

faycalzaidi

A thousand thanks for the explanation! Your explanation is much easier to understand compared to my lecturers'! Could you make some videos about cost functions? :)

patriciachang