I Built a Neural Network from Scratch

I'm not an AI expert by any means, so I've probably made some mistakes. I apologise in advance :)

Also, I only used PyTorch to test the forward pass. Everything else is written in pure Python (plus NumPy).
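
(If anyone's curious what that check can look like: below is a rough sketch of comparing a hand-written NumPy layer against torch.nn.Linear. The layer size, seed and function name are my own illustration, not the actual code from the video.)

import numpy as np
import torch

# Hand-written dense layer with ReLU: y = max(0, x @ W^T + b)
def forward_numpy(x, W, b):
    return np.maximum(0.0, x @ W.T + b)

rng = np.random.default_rng(0)
W = (rng.standard_normal((128, 784)) * 0.01).astype(np.float32)  # shape (out, in)
b = np.zeros(128, dtype=np.float32)
x = rng.standard_normal((1, 784)).astype(np.float32)

# The same layer in PyTorch, with the same weights copied in
layer = torch.nn.Linear(784, 128)
with torch.no_grad():
    layer.weight.copy_(torch.from_numpy(W))
    layer.bias.copy_(torch.from_numpy(b))
    y_torch = torch.relu(layer(torch.from_numpy(x))).numpy()

y_numpy = forward_numpy(x, W, b)
print(np.allclose(y_numpy, y_torch, atol=1e-5))  # should print True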

⭐ Other Social Media Links:

Current Subs: 14,219
Comments

Ngl using that chef hypothetical is such a neat way of explaining how a neural network functions

fragly

"let's think of every neuron as a chef... Now, Let 'em cook 🗿" ahh explanation 😭

Yuzuru_Yamazaki

3:12 the equation states that the loss of a network that returns probabilities between 0 and 1 is the expected output × the negative log of the network's actual output, summed over the outputs. This works because -log of a value between 0 and 1 gives a higher loss as the probability approaches 0 and almost no loss as it approaches 1. Multiplying that by the expected probability makes it so the network only adjusts the values for the outputs you want to approach 1.
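
To make that concrete, here is a tiny NumPy sketch of that loss; the probability values are made up just for illustration:

import numpy as np

def cross_entropy(predicted, expected):
    # Sum over outputs of expected probability * -log(predicted probability);
    # only the outputs with a non-zero expected value contribute.
    eps = 1e-12  # avoid log(0)
    return -np.sum(expected * np.log(predicted + eps))

expected  = np.array([0.0, 0.0, 1.0, 0.0])      # one-hot target: class 2 is correct
confident = np.array([0.05, 0.05, 0.85, 0.05])  # prediction near 1 for class 2
unsure    = np.array([0.30, 0.30, 0.10, 0.30])  # prediction near 0 for class 2

print(cross_entropy(confident, expected))  # ~0.16, small loss
print(cross_entropy(unsure, expected))     # ~2.30, large loss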

SomethingSmellsMichy

very nice mr green code very nice. You deserve a lot more subs for how good these videos are. Can't wait to see what the future holds

ShabJimJets

It's pretty cool to implement all this from scratch. I had studied it a few months ago but forgot most of it because I never practiced it, so this served as a refresher.

DK-oxze

Subscribed, can't wait for more informative videos like this ❤🔥

Ari-pqdb

thanks! really cool! Especially since I just learned about neural networks in class, and watching your video reinforced what I learned.

Hangglide

What a great vid, new sub
You accurately summarized two weeks of classes from an ML Master's where I didn't sleep
Great job doing that and understanding the fundamentals
Ignore bad comments
Keep the pace

santiagogonzalez-hcvp

I went through the same adventure :D I wrote a neural net from scratch in C++ just to get a deep understanding. The backpropagation part took me a while to figure out. I only got to an accuracy of 94% on MNIST, maybe because I still haven't implemented optimizers and batches. Thanks for sharing :)
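
For what it's worth, here is a small self-contained sketch of the mini-batch idea on a toy linear model; everything below is illustrative, not code from either project:

import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 5))
true_w = np.array([1.0, -2.0, 0.5, 3.0, -1.0])
y = X @ true_w + 0.1 * rng.standard_normal(1000)

w = np.zeros(5)
lr, batch_size = 0.1, 64
for epoch in range(20):
    order = rng.permutation(len(X))                   # reshuffle every epoch
    for start in range(0, len(X), batch_size):
        idx = order[start:start + batch_size]
        Xb, yb = X[idx], y[idx]
        grad = 2.0 * Xb.T @ (Xb @ w - yb) / len(idx)  # MSE gradient on this batch only
        w -= lr * grad                                # plain SGD step

print(np.round(w, 2))  # should land close to true_w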

glumpfi

ur explanation is absolutely fantastic!!

chrisx

your creativity and passion shine through every project!

LudieMasu

Awesome bro. Waiting for more videos like this.

Podcast.Motivator

This is a great video, keep making them!

secondlive

Great work bro! u got another subscriber !!

baonguyen

awesome video bro, congrats! I’d like to see a video on how to go from that to a generative AI or a RAG, u know? Are u planning a video like that, or do u have some references?

gabrielrock

You are the best man, you managed to turn a boring topic into a movie. I think you are going places in the content creation industry. Keep going man 🙌🥇

abdulhadiaa

cool just subscribed, hope u will continue to post this kind of stuff, loved your work ...

suvojitsengupt

Your videos are really good and interesting 🔥

nesquickyt

Great explanation! keep on posting great stuff 😎😎

yassinechritt