PyTorch Tutorial 11 - Softmax and Cross Entropy

New Tutorial series about Deep Learning with PyTorch!

In this part we learn about the softmax function and the cross entropy loss function. Softmax and cross entropy are popular functions used in neural nets, especially in multiclass classification problems. Learn the math behind these functions, and when and how to use them in PyTorch. Also learn the differences between multiclass and binary classification problems.

- Softmax function
- Cross entropy loss
- Use softmax and cross entropy in PyTorch (see the sketch after this list)
- Differences between binary and multiclass classification
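
A minimal sketch of both pieces in PyTorch (an illustration, not the video's exact code; the tensor values match the example discussed in the comments below):

import torch
import torch.nn as nn

# raw scores (logits) for one sample with three classes
logits = torch.tensor([[2.0, 1.0, 0.1]])

# softmax turns logits into values between 0 and 1 that sum to 1
probs = torch.softmax(logits, dim=1)
print(probs)  # tensor([[0.6590, 0.2424, 0.0986]])

# nn.CrossEntropyLoss applies log-softmax internally, so it takes the raw
# logits plus a class-index target (here: class 0)
loss = nn.CrossEntropyLoss()(logits, torch.tensor([0]))
print(loss.item())  # approx. 0.4170 = -log(0.6590)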

Part 11: Softmax and Cross Entropy

📚 Get my FREE NumPy Handbook:

📓 Notebooks available on Patreon:

If you enjoyed this video, please subscribe to the channel!

Official website:

Part 01:

Code for this tutorial series:

You can find me here:

#Python #DeepLearning #Pytorch

----------------------------------------------------------------------------------------------------------
* This is a sponsored link. By clicking on it you will not incur any additional costs; instead, you will support me and my project. Thank you so much for the support! 🙏
Comments

Nice video. One nit: it's important to remember that confidence scores != probabilities. They may correlate highly with probabilities, and they may look like probabilistic outputs, but they are not empirically derived probabilities in the strict sense. A model can have a confidence of 0.99, but that is not the probability that the label is correct. If a research scientist would like, they can correlate confidence scores with probabilities empirically using test sets.

bpmoran
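
For what it's worth, a minimal sketch of the empirical check described in the comment above, using entirely synthetic confidences and outcomes (made-up data, not from any real model):

import numpy as np

# synthetic per-sample data: softmax confidence and whether the prediction was correct
confidence = np.array([0.99, 0.95, 0.90, 0.80, 0.75, 0.60, 0.55, 0.51])
correct = np.array([1, 1, 0, 1, 0, 1, 0, 0])

# compare mean confidence to empirical accuracy within each confidence bin;
# for a well-calibrated model the two numbers would roughly agree
for lo, hi in [(0.5, 0.7), (0.7, 0.9), (0.9, 1.01)]:
    mask = (confidence >= lo) & (confidence < hi)
    if mask.any():
        print(f"[{lo}, {hi}): mean confidence {confidence[mask].mean():.2f}, accuracy {correct[mask].mean():.2f}")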

Awesome tutorial. I find some of the concepts a bit tough to grasp, but reviewing them helps a ton, and I like to review by rewatching your videos. Keep up the great work; I'm looking forward to checking out other tutorials on your channel.

Yoyo-sykl

This tutorial is one of the best out there. Thank you so much for making this. It is really appreciated.

manuelsilveriof

Your videos are not only educational and informative, but also very enjoyable. Thank you!

aminaleali

Well, the first error I figured out before watching this video was using softmax at the end of my model, and the second was using logits as the Y target. I learned this just now, after playing with my model for weeks... Thanks for pointing that out for others.

na_
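
For anyone who made the same two mistakes, a minimal before/after sketch (a hypothetical snippet, not the video's code):

import torch
import torch.nn as nn

loss_fn = nn.CrossEntropyLoss()
logits = torch.tensor([[2.0, 1.0, 0.1]])  # raw model output: no softmax layer at the end

# mistake 1: applying softmax first -- the loss already applies log-softmax internally
# loss = loss_fn(torch.softmax(logits, dim=1), torch.tensor([0]))  # silently wrong

# mistake 2: passing logits (or one-hot vectors) as the Y target
# loss = loss_fn(logits, logits)  # wrong kind of target

# correct: raw logits as input, a class index as target
loss = loss_fn(logits, torch.tensor([0]))
print(loss.item())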

Very nice, clear, and detailed PyTorch tutorial!!! I haven't been able to find anything as good so far!

Please keep up the good work and continue to make more tutorials!

tcidude

import numpy as np

def softmax(x):
    # exponentiate and normalize along axis 1, so each row sums to 1
    return np.exp(x) / np.sum(np.exp(x), axis=1, keepdims=True)

x = np.array([[2, 1, 0.1]], dtype=np.float32)
print(softmax(x))  # approx. [[0.659 0.2424 0.0986]]

This should be the softmax for handling multiple examples in a batch.

donfeto
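
As a quick cross-check (a tiny sketch, not from the video), torch.softmax gives the same numbers as the numpy version above:

import torch

x = torch.tensor([[2.0, 1.0, 0.1]])
# dim=1 plays the role of axis=1: normalize across classes, per sample
print(torch.softmax(x, dim=1))  # tensor([[0.6590, 0.2424, 0.0986]])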

The slide at 6:50 was very helpful. Thank you.

gordonlim

It's very friendly to beginners like me. Awesome video and author!

xingfenyizhen

Sir, your video was amazing! Thank you for showing how softmax and cross entropy are implemented in Python!

MinhLe-xkrm

RuntimeError: size mismatch (got input: [3], target: [1]). If you get something like this, you haven't put double brackets when declaring the good/bad prediction array: [[2.0, 1.0, 0.1]].
From my understanding it has to do with the outer/inner dimensions.

gabbebelz
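
A minimal reproduction of that mismatch and its fix (same values as the video's example):

import torch
import torch.nn as nn

loss_fn = nn.CrossEntropyLoss()
Y = torch.tensor([0])  # shape [1]: one sample, labeled as class 0

# single brackets give a 1-D input of shape [3] -> size mismatch against target [1]
# loss_fn(torch.tensor([2.0, 1.0, 0.1]), Y)

# double brackets give shape [n_samples, n_classes] = [1, 3] -> works
print(loss_fn(torch.tensor([[2.0, 1.0, 0.1]]), Y).item())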

Can anyone please help me understand why it's Y = torch.tensor([0])? Shouldn't there be 3 values inside?

vatsal_gamit

Thank you very much for the valuable content! Very helpful PyTorch tutorials!

annalavrenova

3:31 What does N represent?

Also, at 7:49, how do we actually represent and classify the data (as Y = torch.tensor([0]))? I am confused.
Changing 0 to 1 or 2 produces results, so I thought they represented positions as in a list (0: [1 0 0], 1: [0 1 0], 2: [0 0 1]); however, that doesn't appear to be the case, since they yield different answers compared to the numpy method (I used the same Y).

egeerdem
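
If it helps: N at that point is the number of samples, and the 0 in Y = torch.tensor([0]) is a class index, which encodes the same label as the one-hot vector [1 0 0]. The numbers differ from the numpy method because nn.CrossEntropyLoss expects raw logits and applies log-softmax itself, while the numpy function expects already-softmaxed probabilities. A small sketch of the equivalence (assuming the numpy cross-entropy is -sum(actual * log(predicted)), as in the video):

import numpy as np
import torch
import torch.nn as nn

def cross_entropy(actual, predicted):
    # numpy version: one-hot labels times log of softmax probabilities
    return -np.sum(actual * np.log(predicted))

logits = np.array([2.0, 1.0, 0.1])
probs = np.exp(logits) / np.sum(np.exp(logits))

# one-hot [1, 0, 0] encodes the same label as class index 0
print(cross_entropy(np.array([1, 0, 0]), probs))  # approx. 0.4170

# matches when the torch loss is fed the raw logits, not the probabilities
t_logits = torch.tensor(logits).unsqueeze(0)  # shape [1, 3]
print(nn.CrossEntropyLoss()(t_logits, torch.tensor([0])).item())  # approx. 0.4170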

Thank you, Sir. You are doing a great job.

haroldsu

Thank you so much, it is a very good explanation. Thanks a lot!

ahmedchaoukichami

Great tutorial, but why does the target for 3 samples look like [2, 0, 1] and not [[2], [0], [1]]? Thanks!

tz
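
The target for nn.CrossEntropyLoss is a 1-D tensor of class indices, one entry per sample, so for 3 samples it has shape [3] rather than [3, 1]. A small sketch (prediction values are illustrative):

import torch
import torch.nn as nn

loss_fn = nn.CrossEntropyLoss()

# 3 samples x 3 classes: predictions have shape [3, 3]
predictions = torch.tensor([[0.1, 1.0, 2.1],
                            [2.0, 1.0, 0.1],
                            [0.1, 3.0, 0.1]])

# one class index per sample: shape [3], hence [2, 0, 1] and not [[2], [0], [1]]
Y = torch.tensor([2, 0, 1])
print(loss_fn(predictions, Y).item())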

Very good explanation of multiclass vs. binary classification!

asheeshmathur

The last 4 minutes of this video are very important. Could you please explain what to do when I am using MSELoss for autoencoder-based networks? With cross-entropy loss it's working (although that's incorrect), but with MSELoss it's not working.

saifulislamsajol
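
Hard to answer without seeing the code, but for comparison, a minimal MSELoss sketch for a reconstruction setup (shapes and tensors are made up): unlike cross-entropy, MSELoss compares two float tensors of the same shape, so the target is the input itself rather than a class index.

import torch
import torch.nn as nn

loss_fn = nn.MSELoss()

x = torch.randn(4, 8)               # a batch of 4 inputs
reconstruction = torch.randn(4, 8)  # stand-in for the autoencoder's output

# both tensors must have identical shapes; no class indices involved
print(loss_fn(reconstruction, x).item())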

Hey Patrick, great stuff. Can you upload the PPT for this part to your GitHub repository? I checked and it's not there. It would be extremely helpful.

adityashrivastava