Basics of Deep Learning Part 13: Implementing the Backpropagation Algorithm with NumPy

In this series we are covering the basics of deep learning, and in this video we implement the backpropagation algorithm in code.
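Since the video's code itself is not reproduced on this page, here is a minimal sketch of the kind of NumPy implementation the video describes: a small 2-3-1 network with sigmoid activations, trained on XOR with mean squared error. All names, sizes, and hyperparameters here are illustrative, not taken from the video.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR, 4 samples with 2 features each.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# A 2-3-1 network: weights drawn randomly, biases initialized to zero.
W1 = rng.normal(size=(2, 3))
b1 = np.zeros((1, 3))
W2 = rng.normal(size=(3, 1))
b2 = np.zeros((1, 1))

lr = 0.5
losses = []
for _ in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)      # hidden activations, shape (4, 3)
    out = sigmoid(h @ W2 + b2)    # predictions, shape (4, 1)
    losses.append(float(np.mean((out - y) ** 2)))

    # Backward pass (chain rule) for squared error; constant factors
    # from the loss are absorbed into the learning rate.
    d_out = (out - y) * out * (1 - out)   # error signal at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)    # propagated back through W2

    # Gradient descent step.
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

The two `d_*` lines are the whole of backpropagation here: each multiplies the downstream error by the local derivative of the sigmoid, `out * (1 - out)`.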

Links:

If you are wondering why the slides don't disappear even though I am typing in the Jupyter notebook: I used AutoHotkey for that. Here is an article that describes how to use it:
Comments

Thank you for explaining each and every small concept in detail. This is a must-watch series for anyone who wants to learn to implement neural networks from scratch.

SouravGarai

Actually, your videos are much more helpful than my ML class :-)))
My ML professor would be jealous.

jcoixgz

Excellent video, straightforward and to the point. Thank you, m8!

branoraguz

Thank you so much for sharing this awesome course. I really appreciate you describing everything from scratch; it really helped me understand the math behind backpropagation. Most other courses on YouTube unfortunately don't go into that much detail. I also really like that you explain the intuition behind deep learning. I played around a bit with your code and added some extra things: biases, weight initialization, a ReLU activation, and a softmax output with cross-entropy loss, which was very important for me. But I think you left those out on purpose so as not to confuse us, which I appreciated.
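The extras this commenter mentions can be sketched briefly. What follows is an illustrative NumPy forward pass, not the commenter's actual code: He-style initialization (scaling weights by sqrt(2/fan_in), a common choice for ReLU layers), explicit bias terms, ReLU in the hidden layer, and a softmax output. The layer sizes and data are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(0.0, z)

def softmax(z):
    # Subtract the row-wise max for numerical stability before exponentiating.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# He initialization: scale by sqrt(2 / fan_in), suited to ReLU layers.
n_in, n_hidden, n_out = 4, 8, 3
W1 = rng.normal(size=(n_in, n_hidden)) * np.sqrt(2.0 / n_in)
b1 = np.zeros(n_hidden)
W2 = rng.normal(size=(n_hidden, n_out)) * np.sqrt(2.0 / n_hidden)
b2 = np.zeros(n_out)

# Forward pass on 5 random samples; each output row is a probability vector.
X = rng.normal(size=(5, n_in))
probs = softmax(relu(X @ W1 + b1) @ W2 + b2)
print(probs.sum(axis=1))  # each row of a softmax output sums to 1
```

Softmax is normally paired with cross-entropy loss because the combined gradient simplifies to `probs - targets`, which is why the two are usually added together.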

riccinef

This is some extraordinary content, mate...

MrAnandml

Why do we use sigmoid? Isn't softmax what we use for multi-class problems? Please answer, I'm really confused.

FaktariaID
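On the question above (this is general background, not the video author's reply): sigmoid is the usual choice for binary or independent per-class outputs, while softmax is used when the classes are mutually exclusive, because it couples all the outputs into one probability distribution. A tutorial may use sigmoid simply because its example is binary. A small sketch contrasting the two on the same made-up logits:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    e = np.exp(z - z.max())  # shift by the max for numerical stability
    return e / e.sum()

logits = np.array([2.0, 1.0, 0.1])

# Sigmoid scores each class independently; the outputs need not sum to 1.
print(sigmoid(logits).sum())

# Softmax normalizes across classes; the outputs always sum to exactly 1.
print(softmax(logits).sum())
```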