Basics of Deep Learning Part 7: Backpropagation explained – Gradient Descent and Partial Derivatives

In this series, we cover the basics of deep learning. In this video, we continue our discussion of the backpropagation algorithm: we introduce the gradient descent algorithm and explain what partial derivatives are.
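
To make the update rule concrete, here is a minimal sketch of gradient descent in Python (an illustrative example of the general technique, not the notebook from the video). It minimizes a simple function of two variables whose partial derivatives can be written down by hand:

def f(w, b):
    # A simple function of two variables with its minimum at w = -2, b = 3
    return (w + 2) ** 2 + (b - 3) ** 2

def df_dw(w, b):
    # Partial derivative of f with respect to w (b is treated as a constant)
    return 2 * (w + 2)

def df_db(w, b):
    # Partial derivative of f with respect to b (w is treated as a constant)
    return 2 * (b - 3)

w, b = 0.0, 0.0          # starting point
learning_rate = 0.1      # step size

for step in range(100):
    # Move each parameter a small step against its partial derivative
    w -= learning_rate * df_dw(w, b)
    b -= learning_rate * df_db(w, b)

print(w, b, f(w, b))     # approaches w = -2, b = 3, f(w, b) = 0

In a neural network, w and b would be the weights and biases, f would be the loss, and backpropagation is what computes the partial derivatives.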

Edit: At 1:06 I meant to say: "3 digits after the decimal point"

Links:

If you are wondering why the slides don't disappear even though I am typing in the Jupyter notebook: I used AutoHotkey for that. Here is an article that describes how to use it:
Comments

Why can't we find a minimum by an analytical approach here? Because of Galois theory: the degree of the polynomial is 5, so we cannot express the roots in terms of the coefficients.

jcoixgz
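
Even without a general radical formula, a root of a quintic can still be found numerically. As an illustration (assuming, for concreteness, the polynomial x^5 + x - 1 that comes up further down in this thread), simple bisection does the job:

def p(x):
    # The quintic discussed below; p(0) = -1 and p(1) = 1, so a real root lies in (0, 1)
    return x ** 5 + x - 1

lo, hi = 0.0, 1.0
for _ in range(60):
    # Halve the bracketing interval, keeping the sign change inside it
    mid = (lo + hi) / 2
    if p(mid) < 0:
        lo = mid
    else:
        hi = mid

print((lo + hi) / 2)  # ~0.7549, the real root of x^5 + x - 1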

This is the best, simplest, and easiest-to-grasp explanation of gradient descent. Thank you so much for this great tutorial. You are such a gem.

AmmarAhmey

This is the best explanation of gradient descent I've ever heard! Thank you very much for taking us through the whole process instead of just giving the final formula.

timur.kabizhanov

Oops. In the GENERAL case you cannot solve equations of fifth degree and higher by radicals. This is Galois' theorem.

jcoixgz

If you type Solve[x^5 + x - 1 == 0, x] into Wolfram Alpha, it gives a real solution: 1/3 (-1 + (1/2 (25 - 3 sqrt(69)))^(1/3) + (1/2 (25 + 3 sqrt(69)))^(1/3)). Although it looks ugly, it is a solution.

jcoixgz
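
The two comments are consistent, by the way: this particular quintic factors as (x^2 - x + 1)(x^3 + x^2 - 1), so its real root comes from a solvable cubic, which is why a radical expression exists despite the general-case impossibility. A quick numerical check of the quoted root (assuming the Wolfram Alpha output above is transcribed correctly):

# Evaluate the quoted closed-form expression numerically
x = (-1 + (0.5 * (25 - 3 * 69 ** 0.5)) ** (1 / 3)
        + (0.5 * (25 + 3 * 69 ** 0.5)) ** (1 / 3)) / 3

print(x)               # ~0.7549, matching the numeric root found by bisection
print(x ** 5 + x - 1)  # ~0.0, so the expression is indeed a root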