Lecture 6: Backpropagation

Lecture 6 discusses the backpropagation algorithm for efficiently computing gradients of complex functions. We discuss the idea of a computational graph as a data structure for organizing computation, and show how the backpropagation algorithm allows us to compute gradients by walking the graph backward and performing local computation at each graph node. We show examples of implementing backpropagation in code using both flat and modular implementations. We generalize the notion of backpropagation from scalar-valued functions to vector- and tensor-valued functions, and work through the concrete example of backpropagating through a matrix multiply operation. We conclude by briefly mentioning some extensions to the basic backpropagation algorithm including forward-mode automatic differentiation and computing higher-order derivatives.
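For readers who want a concrete picture of the "flat vs. modular" distinction mentioned above, here is a minimal sketch of the modular style on a scalar computational graph. It is an illustrative reconstruction, not the lecture's actual code: the Multiply and Sigmoid node classes and the tiny w * x example graph are assumptions made for the sketch.

```python
# Minimal sketch of modular backpropagation on a scalar graph.
# Each node caches what it needs in forward() and applies the chain
# rule (upstream gradient times local gradient) in backward().
import math

class Multiply:
    def forward(self, x, y):
        self.x, self.y = x, y   # cache inputs for the backward pass
        return x * y

    def backward(self, grad_out):
        # d(x*y)/dx = y and d(x*y)/dy = x
        return grad_out * self.y, grad_out * self.x

class Sigmoid:
    def forward(self, x):
        self.out = 1.0 / (1.0 + math.exp(-x))
        return self.out

    def backward(self, grad_out):
        # d(sigmoid(x))/dx = sigmoid(x) * (1 - sigmoid(x))
        return grad_out * self.out * (1.0 - self.out)

# Forward pass through a tiny graph: L = sigmoid(w * x)
mul, sig = Multiply(), Sigmoid()
z = mul.forward(2.0, -1.0)   # w = 2.0, x = -1.0
L = sig.forward(z)

# Backward pass: walk the graph in reverse, seeding with dL/dL = 1
dz = sig.backward(1.0)
dw, dx = mul.backward(dz)
print(L, dw, dx)
```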

_________________________________________________________________________________________________

Computer vision has become ubiquitous in our society, with applications in search, image understanding, apps, mapping, medicine, drones, and self-driving cars. Core to many of these applications are visual recognition tasks such as image classification and object detection. Recent developments in neural network approaches have greatly advanced the performance of state-of-the-art visual recognition systems. This course is a deep dive into the details of neural-network-based deep learning methods for computer vision. During this course, students will learn to implement, train, and debug their own neural networks and gain a detailed understanding of cutting-edge research in computer vision. We will cover learning algorithms, neural network architectures, and practical engineering tricks for training and fine-tuning networks for visual recognition tasks.

Comments

This lecture is an example of a perfect technical lecture.

quanduong

Prof... stop... stop... it's already dead! Oh BP, you thought you were this tough, complex thing, and then you met Prof. Justin Johnson, who ended you once and for all!

The internet is 99.99% garbage, but content like this makes me so glad that it exists. What a masterclass! What a man!

sachinpaul

Amazing! One of the best backprop explanations out there!

ritvikkhandelwal

Finally!! I understood how to apply backpropagation. Thank you, sir! Thank you!

achronicstudent

Sir, you are amazing! I've wasted hours reading and watching internet gurus on this topic, and they could not explain it at all, but your lecture worked!

piotrkoodziej

Dr. JJ, you sly son of a gun. This is one of the best things ever. 47:39, the way he asks if it is clear. It is damn clear, man. Well done!

vardeep

I work in ML and am reviewing for interviews; this lecture is extremely thorough!

odysy

Best lecture ever on the math behind backpropagation.

rookie

Finally, some coverage of backprop with tensors.

dbzrz

Such an amazing lecture with easy-to-understand examples!

kentu

For future reference: I think there's a typo @ 50:24. It should be dz/dx * dL/dz when using the chain rule to find dL/dx.

ryliur
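(For context on the comment above: at a node that takes input x and produces output z, the chain rule gives dL/dx = (dL/dz) * (dz/dx), i.e. the upstream gradient times the local gradient. For the scalar case in the lecture the two factors commute, so dz/dx * dL/dz denotes the same quantity.)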

At 58:56 Prof. Johnson says something huge, imho: the final equation is not formed from explicit Jacobians. Finally I got it. Simply the best explanation of backprop. Thank you, Prof. Johnson.

liviumircea
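The point about Jacobians has a concrete form in the matrix-multiply example from the lecture. Below is a short sketch, assuming NumPy (the shapes and variable names are illustrative): each gradient of a matrix multiply is itself just another matrix multiply, and the enormous four-dimensional Jacobian is never materialized.

```python
# Backprop through z = x @ w without forming the Jacobian of z.
import numpy as np

x = np.random.randn(4, 3)        # inputs,  shape (N, D)
w = np.random.randn(3, 5)        # weights, shape (D, M)
z = x @ w                        # outputs, shape (N, M)

grad_z = np.random.randn(4, 5)   # upstream gradient dL/dz, same shape as z

# Each gradient is a single matrix multiply with the right transpose,
# chosen so the result matches the shape of x or w respectively.
grad_x = grad_z @ w.T            # dL/dx, shape (N, D)
grad_w = x.T @ grad_z            # dL/dw, shape (D, M)
```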

You earned a like, a comment, and a subscriber... what an explanation.

KeringKirwa

What a superb lecture on backpropagation. Simply amazing.

VikasKM

Thank you very much! I really enjoyed this lecture! Hello from Russia with love :)

mihailshutov

I don't get how the backpropagation tutorials by 3B1B, StatQuest, etc. get so much praise; none of them is as succinct as you were in those first two examples. Fuck, that was simple.

tomashaddad

10:02: Dr. Johnson means "right to left," not "left to right."

shoumikchow

The good thing about these lectures is that Dr. Johnson finally has more time to speak compared to CS231n!

mohamedgamal-giws

Such an awesome and intuitive explanation!

minhlong

How come you are getting the value of e^x as -0.20? Could you explain?

anupriyochakrabarty
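If this question refers to the classic sigmoid example worked through in the lecture, -0.20 is not the value of e^x but the gradient flowing backward through the exp node: the node's input is -1.00, so its local gradient is e^(-1.00) ≈ 0.37, and multiplying by the upstream gradient of -0.53 gives -0.53 × 0.37 ≈ -0.20.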