Backpropagation In Depth

In this video, we'll take a deep dive into backpropagation to understand how data flows in a neural network. We'll learn how to break functions into operations, then use those operations to build a computational graph. At the end, we'll build a miniature PyTorch to implement a neural network.
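To give a feel for what "a miniature PyTorch" involves, here is a minimal sketch of the idea (our own illustration, not the code built in the video): a scalar autograd value where each operation records its inputs, forming a computational graph, and backward() walks that graph in reverse to apply the chain rule.

import math

class Value:
    # Hypothetical micrograd-style node; not the course's actual implementation.
    def __init__(self, data, _parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = _parents
        self._backward = lambda: None

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad    # d(a+b)/da = 1
            other.grad += out.grad   # d(a+b)/db = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad  # d(a*b)/da = b
            other.grad += self.data * out.grad  # d(a*b)/db = a
        out._backward = _backward
        return out

    def exp(self):
        out = Value(math.exp(self.data), (self,))
        def _backward():
            self.grad += out.data * out.grad    # d(e^a)/da = e^a
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then run each node's chain-rule step in reverse.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()

# y = exp(a*b + a); backward() fills in dy/da and dy/db.
a, b = Value(2.0), Value(3.0)
y = (a * b + a).exp()
y.backward()
print(a.grad, b.grad)  # 4*e^8 and 2*e^8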

Chapters:
00:00 Introduction
06:19 Staged softmax forward pass
09:24 Staged softmax backward pass
23:19 Analytic softmax
29:30 Softmax computational graph
42:26 Neural network computational graph
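As a rough illustration of what "staged" means in the chapter titles above (our own NumPy sketch, not the video's code): the softmax forward pass is broken into elementary operations, the backward pass applies the chain rule to each stage in reverse, and the result matches the analytic softmax Jacobian product.

import numpy as np

def softmax_staged(x):
    # Forward pass as a pipeline of simple stages, caching intermediates for backprop.
    s = x - x.max()       # stage 1: shift for numerical stability
    e = np.exp(s)         # stage 2: exponentiate
    t = e.sum()           # stage 3: sum
    p = e / t             # stage 4: normalize
    return p, (e, t)

def softmax_backward(dp, cache):
    # Backward pass: chain rule applied one stage at a time, in reverse order.
    e, t = cache
    de = dp / t                      # through p = e / t, w.r.t. e
    dt = -(dp * e).sum() / t**2      # through p = e / t, w.r.t. t
    de = de + dt                     # through t = e.sum(): every e_i receives dt
    ds = de * e                      # through e = exp(s)
    # ds sums to zero, so the max-shift stage passes the gradient through unchanged.
    return ds

x = np.array([1.0, 2.0, 3.0])
dp = np.array([0.1, -0.3, 0.2])     # some upstream gradient dL/dp
p, cache = softmax_staged(x)
dx_staged = softmax_backward(dp, cache)
dx_analytic = p * (dp - p @ dp)     # analytic softmax Jacobian applied to dp
print(np.allclose(dx_staged, dx_analytic))  # True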

This video is part of our new course, Zero to GPT - a guide to building your own GPT model from scratch. By taking this course, you'll learn deep learning skills from the ground up. Even if you're a complete beginner, the prerequisite courses we offer at Dataquest can get you started.

If you're dreaming of building deep learning models, this course is for you.

Best of all, you can access the course for free while it's still in beta!

Sign up today!
Comments

I actually found Andrej Karpathy's two-hour video on his micrograd library (part of his NN: Zero to Hero course), and it's an absolutely perfect explanation for anyone who wants to understand how a neural network behaves, and specifically how backpropagation works and how all the gradients are calculated on the backward pass through the computational graph. With all respect to Vik, I found Andrej's explanation a bit clearer for newbies, and it also covers a bunch of general Python coding techniques I wasn't familiar with.

anfedoro