Neural Networks Part 6: Cross Entropy

When a Neural Network is used for classification, we usually evaluate how well it fits the data with Cross Entropy. This StatQuest gives you an overview of how to calculate Cross Entropy and Total Cross Entropy.
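As a minimal sketch of the calculation the video walks through: the cross entropy for one observation is the negative log of the predicted probability assigned to the observed (true) class, and the Total Cross Entropy is the sum over all observations. The class names and probabilities below are made up for illustration, not taken from the video.

```python
import math

def cross_entropy(predicted_probs, observed_class):
    """Cross entropy for one observation: -log of the predicted
    probability for the observed (true) class."""
    return -math.log(predicted_probs[observed_class])

# Hypothetical predicted probabilities (e.g. from a softmax layer)
# for three flower species; each dict is one observation.
predictions = [
    {"setosa": 0.57, "versicolor": 0.20, "virginica": 0.23},
    {"setosa": 0.58, "versicolor": 0.20, "virginica": 0.22},
    {"setosa": 0.04, "versicolor": 0.90, "virginica": 0.06},
]
observed = ["setosa", "setosa", "versicolor"]

# Total Cross Entropy = sum of the per-observation cross entropies.
total = sum(cross_entropy(p, c) for p, c in zip(predictions, observed))
print(round(total, 3))
```

Note that only the probability of the observed class enters each term; the probabilities for the other classes matter indirectly, because softmax forces them all to sum to 1.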

NOTE: This StatQuest assumes that you are already familiar with...

For a complete index of all the StatQuest videos, check out:

If you'd like to support StatQuest, please consider...

Buying my book, The StatQuest Illustrated Guide to Machine Learning:

...or...

...a cool StatQuest t-shirt or sweatshirt:

...buying one or two of my songs (or go large and get a whole album!)

...or just donating to StatQuest!

Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on twitter:

0:00 Awesome song and introduction
1:48 Cross Entropy defined
2:50 General equation for Cross Entropy
4:11 Calculating Total Cross Entropy
5:41 Why Cross Entropy and not SSR?
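The last chapter asks why we use Cross Entropy rather than the sum of squared residuals (SSR). One way to see the difference, sketched below with made-up probabilities: as the predicted probability for the observed class shrinks toward 0, cross entropy grows without bound, while the squared residual for that prediction is capped at 1, so cross entropy penalizes confidently wrong predictions far more heavily.

```python
import math

# Compare the two losses for the observed class as its predicted
# probability p shrinks toward 0 (a confidently wrong prediction).
for p in [0.9, 0.5, 0.1, 0.01, 0.001]:
    ce = -math.log(p)       # cross entropy: unbounded as p -> 0
    ssr = (1 - p) ** 2      # squared residual: capped at 1
    print(f"p={p:<6} cross entropy={ce:8.3f}  SSR={ssr:.3f}")
```

The steeper loss surface also means larger gradients for bad predictions, which helps backpropagation move the weights faster when the model is badly wrong.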

#StatQuest #NeuralNetworks #CrossEntropy
Comments

What is this guy made of??? What does he eat??? Are you a God?? An alien?? You are so smart and dope man!!! How do you do all this? He should be a lecturer at MIT! SO underrated content💞💞💞💞💞💞

meeseeks

Josh once again demonstrates his amazing ability to simplify complicated topics into elemental concepts that can be easily understood. BAM!

NStewF

I’ve asked for so long about the reasons behind using the logarithm to calculate the loss, but no one could explain it well enough to build intuition about it. This did it. Thank you!

AlexandrSarioglo

Your videos are the best for fundamental knowledge regarding ML/AI. I've been working on transformers for 4 months, and I come back very often for the fundamentals. THANKS Josh!!

NJCLM

Happy Teacher's Day from India; it's Teacher's Day here today. Thanks for all your teaching.

sattanathasiva

I was certain that I would need to watch several videos to grasp this concept. OMG!! You have explained it so intuitively. Thanks a lot for saving my time and energy.

mukulbarai

I just can’t believe how you opened my eyes. How can you be so awesome 👌👌. Sharing this knowledge for free is amazing.

josyulaprashanth

I admire this professor a lot. I hope one day to be a good teacher like you. Salute from Brazil. In my classes I try to do the same: take a subject and make it as easy as possible.

RaynerGS

So refreshing and so different from the mathematical riddles that are used in university to teach us this stuff. Thank you!

crealpt

This is a life saver! Thank you so much again and again. Love your simple and elegant explanations.

somanshbudhwar

Whenever I 'wonder' while watching StatQuest, Josh tells me the solution right after :)

vijaykumarlokhande

Thank you for saving so much of my time. There are so many blogs on NNs, and I wasted hours and days on them across various topics before I found your channel. Thank God for that.

WIFI-nftg

Another fantastic video. You make these topics so straightforward to understand, topics that most lecturers overcomplicate by writing down unnecessarily long formulas and just showing off their knowledge. Thanks a lot!

vusalaalakbarova

Thank you sooo much! I have a master's degree in CS, and this is substantially better than anything I learnt in college; I understand it at an intuitive level. Thank you sooo much!!

MLLearner-sbds

Josh, you are a savior, man. I cannot emphasize this enough. I would have given up on understanding these concepts long ago had you not made these videos.

SushilKumar-drrj

Hello Josh!

I have to say WOW!! I love every single one of your videos!! They are so educational. I recently started studying ML for my master's degree, and from the moment I found your channel, ALL the questions I wonder about get answered! Also, I noticed that you reply to every post in the comment section. I am astonished... no words. A true professor!

Thanks for everything! Thank you for being a wonderful teacher.

neoklismimidis

I would not have been able to understand how neural networks fundamentally work without this series. Thank you so much Josh! Amazing and clear explanations!

supersnowva

The hero we wanted, and the hero we needed, StatQuest...

tiago

This really clarified so many of the concepts in this topic! I always wondered what the purpose of cross entropy is when we can use other loss functions like mean squared error. Thank you so much!

Noah-zpfn

What an amazing video. Never found any content or video better than this one anywhere on this topic. Thank you so much.

sanchibharti