Backpropagation in Neural Networks - EXPLAINED!

Greetings fellow learners! This is the 2nd video in a playlist where we are going to talk about the fundamentals of building neural networks! Here, we cover backpropagation.
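The video walks through backpropagation itself; as a rough companion, here is a minimal NumPy sketch of the forward and backward pass for a one-hidden-layer network. The layer sizes, learning rate, and toy data below are made up for illustration and are not taken from the video.

```python
# Minimal backpropagation sketch for a 1-hidden-layer network.
# Sizes, learning rate, and data are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))             # 4 samples, 3 features
y = np.array([[0.], [1.], [1.], [0.]])  # toy targets

W1 = rng.normal(size=(3, 5)) * 0.1      # input -> hidden weights
W2 = rng.normal(size=(5, 1)) * 0.1      # hidden -> output weights
lr = 0.1

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for step in range(1000):
    # Forward pass
    h = sigmoid(X @ W1)                 # hidden activations
    y_hat = sigmoid(h @ W2)             # predictions
    loss = np.mean((y_hat - y) ** 2)    # mean squared error

    # Backward pass: apply the chain rule layer by layer
    d_yhat = 2 * (y_hat - y) / len(y)   # dL/d_yhat
    d_z2 = d_yhat * y_hat * (1 - y_hat) # through the output sigmoid
    dW2 = h.T @ d_z2                    # gradient for W2
    d_h = d_z2 @ W2.T                   # propagate error to hidden layer
    d_z1 = d_h * h * (1 - h)            # through the hidden sigmoid
    dW1 = X.T @ d_z1                    # gradient for W1

    # Gradient descent update
    W1 -= lr * dW1
    W2 -= lr * dW2

    if step % 200 == 0:
        print(f"step {step}: loss = {loss:.4f}")
```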

ABOUT ME

RESOURCES

PLAYLISTS FROM MY CHANNEL

MATH COURSES (7 day free trial)

OTHER RELATED COURSES (7 day free trial)
Comments

If you think I deserve it, please consider giving this video a like as it would help tremendously. Thank you!

CodeEmporium

Sir, please make a playlist for graph neural networks (GCN, GAT, etc.) 🙏🙌🙌

keshavraghuwanshi

Can a neural net have only a single weight per node, instead of a weight per input?

bofloa

There is NO video on YouTube about training Mamba S4 from scratch. Can you do one, please? I believe it will be the GOAT in terms of views and new subs.

АлексейДорошев-зв

Quiz:
#1: B
#2: C
#3: B

I hope :D

intptointp

B, C, B? Fantastic video btw, perfect for people like me with a destroyed attention span…

sanhar

For quiz 1: B. Because neural networks use supervised learning, so we train them with labelled data?

Euro_notus

Good video, but you need to get back to digging deeper into transformers...

rpraver