Neural Networks from Scratch - P.6 Softmax Activation

The what and why of the Softmax Activation function in deep learning.

#nnfs #python #neuralnetworks
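
For readers skimming before watching: the video covers why raw network outputs are exponentiated and then normalized into probabilities. A minimal sketch of that idea in NumPy (illustrative values, not necessarily the exact code from the video or the book):

    import numpy as np

    # Raw outputs (logits) from the final layer for one sample
    layer_outputs = np.array([4.8, 1.21, 2.385])

    # Softmax: exponentiate, then normalize so the values sum to 1
    exp_values = np.exp(layer_outputs)
    probabilities = exp_values / np.sum(exp_values)

    print(probabilities)        # approx. [0.895 0.025 0.080]
    print(probabilities.sum())  # 1.0
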
Comments

Received the hardback version of your book three days ago and I am very pleased with the print quality; it is also a rather hefty book, and I am so glad I pre-ordered it. If there is anyone reading my comment and wondering whether the book is worth buying, then just stop wondering and buy it! Highly recommended!

MartinPHellwig

Sentdex man: you torture me so hard when you don't upload videos regularly. THANKS FOR EVERYTHING!

nicop

OMG he's back!!! To all those who doubted this series - never doubt the sentdex you donuts

BehindEnemyChimes

I think one of the most overlooked things about online video learning is that you can pause as much as you like, adjust playback speed, rewind, listen again, etc. I just noticed that if I could do this in an actual real-time lecture, I'd probably learn just as well. All the discomforts (toilet breaks, hunger), distractions, etc. can be dealt with immediately and you can resume studying.
And I'm not even factoring in all the timing / travelling stuff that comes with real-time lectures.

blzahz

Late to the party, but I really want to thank you for walking through what is happening step by step. So many AI tutorials treat it like magic rather than actually running through the math, which is terrible for actual learning.

xaviermagnus

Holy moly, I almost stopped going through the playlist. I was about halfway through and it looked like an abandoned project, and then this comes up in my feed. No excuses now.

AndreyAntonchik

After what happened with Cyberpunk 2077, this comeback brings peace to my soul.

Tailz

Where were you? I'm way too excited to see you back... please finish this series.

souravjha

I never really dug into the softmax before, but with a background in theoretical physics (statistical mechanics), it is truly fascinating to me that we see normalized inputs (which, in effect, create the function exp[-x]) leading to the exact same equation as the canonical probability distribution.

Neural nets (even basic ones) really are fascinating machines. This series has given me such a new perspective on them. Keep up the great work guys.
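
For anyone curious about the correspondence mentioned above, written out explicitly (standard forms of both distributions; identifying the logits with scaled negative energies is the assumption that makes the analogy exact):

    \text{softmax:}\quad p_i = \frac{e^{z_i}}{\sum_j e^{z_j}}
    \qquad
    \text{canonical ensemble:}\quad p_i = \frac{e^{-\beta E_i}}{\sum_j e^{-\beta E_j}},\quad \beta = \frac{1}{k_B T}

    \text{Setting } z_i = -\beta E_i \text{ turns one expression into the other.}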

michaeljburt

Mathematically speaking, I thought it was quite elegant how the normalized values after exponentiation (without subtraction of the max value) were the same as the normalized values after exponentiation (with subtraction). Really cool result! I believe that this might be yet another good reason why the exponential function is used in the Softmax activation function (in addition to the fact that it deals with negative numbers in a convenient manner).
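
A quick numerical check of that result, sketched in NumPy (example values are made up): since exp(x - c) = exp(x) / exp(c), the constant exp(c) appears in both the numerator and the denominator and cancels, so subtracting the max changes nothing except preventing overflow for large inputs.

    import numpy as np

    def softmax(x):
        exp_values = np.exp(x)
        return exp_values / np.sum(exp_values)

    def softmax_stable(x):
        # Shift so the largest input becomes 0; np.exp() can no longer overflow
        exp_values = np.exp(x - np.max(x))
        return exp_values / np.sum(exp_values)

    logits = np.array([2.0, 1.0, 0.1])
    print(np.allclose(softmax(logits), softmax_stable(logits)))  # True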

narinpratap

Honestly, since I started this series I've been considering getting the book, and to be honest, at the beginning I was completely lost, but as I continued and re-watched the videos to get a better understanding of some concepts, I finally understand...

I need that book

luckydice

I'm loving this series of videos. The main reason is the way you show the reasoning behind every step you take: starting with verbose but easy-to-understand lines of code with a couple of for loops, then expressing it in terms of much more readable NumPy functions. Can't wait for the next one.
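
As a rough illustration of that loop-to-NumPy progression (a sketch of the general pattern, not the exact code from the video):

    import math
    import numpy as np

    inputs = [[4.8, 1.21, 2.385],
              [8.9, -1.81, 0.2]]

    # Verbose version: explicit Python loops, one row (sample) at a time
    probs_loop = []
    for row in inputs:
        exp_row = [math.exp(v) for v in row]
        row_sum = sum(exp_row)
        probs_loop.append([v / row_sum for v in exp_row])

    # NumPy version: the same operations expressed on the whole batch at once
    exp_values = np.exp(inputs)
    probs_np = exp_values / np.sum(exp_values, axis=1, keepdims=True)

    print(np.allclose(probs_loop, probs_np))  # True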

MrLipdx

Thank you very much, sir, for your hard work and explanation. We are very blessed to have a teacher like you. Love from India ❤️

unknownusername

Thank you Sentdex! Can’t wait for more videos. My copy of the book arrived last week and if anyone is on the fence about buying it... Do it!

wbrozovic

WOW this is really fresh, it hasn't been added to the playlist...yet. 2 thumbs up!!!

kenchang

This series is so incredibly valuable! You are a great teacher, Harrison! I'm wondering, if the book is already finished, why is the video series taking so long to complete? It's been more than a year since part 1. Not complaining, just wondering. I'd like to continue this till the end and be a neural network master ;)

willemvdk

Thanks for restarting the series, bro... when you discontinued the series I was like, oh mummy god... now I'm eagerly waiting for this series. I hope you won't do that again.

madhusudhanreddygone

I received the hardcopy of the book; although it came extremely late (1 month!), I am pleased with the print quality and the ability to read from a physical book rather than a monitor!
I already read the book via the softcopy, but I recommend that you read the book at least twice to grasp the difficult concepts.
Harrison and Daniel have done an OUTSTANDING job of simplifying many hard-to-understand topics.
Keep up the good work and please add more videos and animations.
Can't wait to see the video explaining chapter 9 (backpropagation).

moodiali

Finally you are back! Glad to see you are doing well. Of course there was no way I wasn't buying that book. I haven't gotten to read it yet, but so far it seems amazing, and I'm so happy to have it! Please keep doing what you love!

mangaart

Justin Gaethje recovered from the Khabib fight and is back to coding NNs. I'm glad!

chronicsnail