Math Behind Neural Networks and Deep Learning: Backpropagation

In neural networks, learning is handled by backpropagation, which propagates the error back to the weights according to their contributions; the algorithm computes how much each weight contributed to the error. In this video we'll cover the math behind backpropagation.
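As a rough illustration of the idea above, here is a minimal sketch of backpropagation for a tiny two-layer network. All names (W1, W2, the sizes, the learning rate) and the choice of sigmoid activations with squared-error loss are illustrative assumptions, not taken from the video.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x = rng.normal(size=(3, 1))        # a single input example (made up)
y = np.array([[1.0]])              # its target output

W1 = rng.normal(size=(4, 3))       # hidden-layer weights
W2 = rng.normal(size=(1, 4))       # output-layer weights

lr = 0.5
for _ in range(200):
    # forward pass
    h = sigmoid(W1 @ x)            # hidden activations
    y_hat = sigmoid(W2 @ h)        # prediction

    # backward pass: the chain rule distributes the error to each layer
    d_out = (y_hat - y) * y_hat * (1 - y_hat)   # error at the output
    d_hid = (W2.T @ d_out) * h * (1 - h)        # error reflected back to the hidden layer

    # gradient descent step: each weight moves by its contribution to the error
    W2 -= lr * d_out @ h.T
    W1 -= lr * d_hid @ x.T

loss = float((y_hat - y) ** 2)
print(f"final squared error: {loss:.6f}")
```

The backward pass is the whole trick: the error signal at the output is multiplied back through the weights (`W2.T @ d_out`), so each earlier weight receives exactly its share of the blame.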

Francois Chollet (Google AI researcher) put it this way: they are neither neural nor networks! They are chains of differentiable, parameterized geometric functions, trained with gradient descent (with gradients obtained via the chain rule). A small set of high-school-level ideas put together.
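To make "gradient descent obtained via the chain rule" concrete, here is a minimal one-weight sketch: minimizing the squared error of a sigmoid of `w * x`. The values of `x`, `y`, the learning rate, and the loop length are made-up illustrative choices.

```python
import math

x, y, w, lr = 2.0, 0.8, 0.0, 1.0   # hypothetical input, target, initial weight, step size

for _ in range(100):
    z = w * x
    a = 1.0 / (1.0 + math.exp(-z))          # sigmoid(z)
    # chain rule: dL/dw = dL/da * da/dz * dz/dw
    grad = 2.0 * (a - y) * a * (1.0 - a) * x
    w -= lr * grad                           # one gradient descent step

print(f"prediction after training: {a:.4f}")
```

Each factor in `grad` is one link in the chain; backpropagation is just this multiplication repeated layer by layer.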

We'll put these ideas together in this video.

Want more? Connect with me here:

If you like my videos, you can support my effort with a financial contribution on