Deep Neural Network in Python from Scratch | L-Layer Model | No TensorFlow

We will create a deep neural network in Python from scratch. We are not going to use TensorFlow or any built-in model; the code is written entirely from scratch in Python. We will code a deep neural network with L layers.

This video is for enthusiasts who love to know the under-the-hood details of how things work. You can use TensorFlow directly to create a deep neural network, but if you are curious about how things work in Python from scratch, then this video is for you.

Understanding a deep neural network in Python from scratch helps you learn how deep learning actually works and gives you confidence in machine learning.

If you have followed my playlist on neural networks, writing this code will be super simple for you. I have tried to explain a difficult piece of code in a simple manner, so please let me know in the comments what you think of this video.

➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖

Timestamps (a rough code sketch of these steps follows the list):
0:00 Coming Next
0:30 Intro
3:14 Overview
6:32 Initializing Parameters
14:57 Forward Propagation
23:36 Cost Function
26:22 Backward Propagation
33:10 Update Parameters
34:23 Complete Model
40:36 Improving Model Look
48:44 End
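
Before watching, here is a rough end-to-end sketch of the steps listed above (initializing parameters, forward propagation, cost, backward propagation, updating parameters, complete model). It assumes ReLU hidden layers and a sigmoid output with binary cross-entropy; the layer sizes, variable names, and the train_model() driver are my own illustrative choices, not the video's exact code.

import numpy as np

def initialize_parameters(layer_dims):
    # layer_dims, e.g. [n_x, 20, 7, 1]: input size, hidden sizes, output size
    params = {}
    for l in range(1, len(layer_dims)):
        params["W" + str(l)] = np.random.randn(layer_dims[l], layer_dims[l - 1]) * 0.01
        params["b" + str(l)] = np.zeros((layer_dims[l], 1))
    return params

def forward_propagation(X, params):
    L = len(params) // 2                      # number of layers
    cache = {"A0": X}
    A = X
    for l in range(1, L + 1):
        Z = params["W" + str(l)] @ A + params["b" + str(l)]
        A = np.maximum(0, Z) if l < L else 1 / (1 + np.exp(-Z))  # ReLU / sigmoid
        cache["Z" + str(l)], cache["A" + str(l)] = Z, A
    return A, cache

def compute_cost(AL, Y):
    # binary cross-entropy averaged over the m training examples
    m = Y.shape[1]
    return -np.sum(Y * np.log(AL) + (1 - Y) * np.log(1 - AL)) / m

def backward_propagation(params, cache, Y):
    grads = {}
    L = len(params) // 2
    m = Y.shape[1]
    dZ = cache["A" + str(L)] - Y              # sigmoid + cross-entropy shortcut
    for l in range(L, 0, -1):
        grads["dW" + str(l)] = dZ @ cache["A" + str(l - 1)].T / m
        grads["db" + str(l)] = np.sum(dZ, axis=1, keepdims=True) / m
        if l > 1:
            dA_prev = params["W" + str(l)].T @ dZ
            dZ = dA_prev * (cache["Z" + str(l - 1)] > 0)   # ReLU derivative
    return grads

def update_parameters(params, grads, learning_rate):
    for l in range(1, len(params) // 2 + 1):
        params["W" + str(l)] -= learning_rate * grads["dW" + str(l)]
        params["b" + str(l)] -= learning_rate * grads["db" + str(l)]
    return params

def train_model(X, Y, layer_dims, learning_rate=0.01, iterations=1000):
    # X: (n_x, m) inputs, Y: (1, m) binary labels
    params = initialize_parameters(layer_dims)
    for i in range(iterations):
        AL, cache = forward_propagation(X, params)
        grads = backward_propagation(params, cache, Y)
        params = update_parameters(params, grads, learning_rate)
        if i % 100 == 0:
            print(f"iteration {i}: cost = {compute_cost(AL, Y):.4f}")
    return params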

➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖

Comments

Dude.... your videos are FABULOUS!!
Keep going!! We need you!!

mohamedkh

Bro, your content and you as well are AWESOME.
Liked and subbed, keep it up! <3

Felloggs

Bro, this is the simplest explanation of all that I have seen.

lokendrakumar

Just what I need to understand this with clarity.

SatyamKumar-xgsg

You and Josh Starmer (StatQuest) totally demystify DNNs. Thanks!!!

matthiasblumrich

It's a pleasure to learn with your lesson!

matteomanzi

Straightforward and to the point! Good video.

lecturesfromleeds

Thank you, it's the best and clearest video I have watched 😍 You're an extremely handsome man!!!

threetime-nedc

Thanks bro! You are simply a great teacher.

engineeringdecrypted

Hey, great playlists! I can now say I really understand deep learning. Can you please make a video explaining the perceptron algorithm and its complexity, along with kernels?!

ikrameounadi

Very decent explanation. Would you like to do the same for CNNs?

ajay

Thank you so much sir, very helpful 🙂

ashraf_isb

Very helpful. Please make a playlist for GANs and transformers like you made for CNNs.

svk

I really liked it. I would also like for you to create a video about LSTMs and Transformers (from scratch).

mustafatuncer

In terms of performance, do all the string concatenations to get the parameters from dicts (the "Z" + str(l) and stuff) slow down the code, or is this trivial compared to the actual parameter multiplication?

robertdemka
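
(A small aside on the question above: the pattern it refers to looks roughly like the snippet below; this is an illustrative sketch with made-up sizes, not the video's exact code. Building short key strings like "W" + str(l) is typically negligible next to the matrix multiplications themselves, and the keys can be built once up front if it ever mattered.)

import numpy as np

# Hypothetical illustration of the dict-key lookup pattern discussed above.
params = {"W1": np.random.randn(4, 3), "b1": np.zeros((4, 1))}
A0 = np.random.randn(3, 5)

l = 1
Z = params["W" + str(l)] @ A0 + params["b" + str(l)]   # "W" + str(l) -> "W1"

# If the string building were ever a concern, the keys could be precomputed:
keys = [("W" + str(l), "b" + str(l)) for l in range(1, 2)]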

Good stuff. No videos for a year? Please keep uploading. Thank you.

divyagarh

Thanks a lot bro, your videos really helped me.

soumyadeepsarkar

Is this video suitable for beginners? If not, recommend what to watch before jumping into this.

uchindamiphiri

Hi, Mr. Jay Patel!
Thanks a lot for such a clear explanation!
Why don't you use the derivative of the sigmoid function at the output layer (AL) during the backward pass?
Can we say that the weights of the last layer (WL) learn without the output error (AL - Y) being passed back through the sigmoid?
If so, why don't you and other people use it?

ИсмоилОдинаев-йя
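
(A hedged note on the question above, based on the standard derivation rather than the video's code: the sigmoid derivative is not skipped at the output layer; it cancels algebraically when the sigmoid is paired with the binary cross-entropy cost, which is why dZ for the last layer reduces to AL - Y. The snippet below, with made-up numbers, checks that the two forms agree.)

import numpy as np

# Standard derivation (not taken from the video's code):
#   dL/dAL  = -(Y/AL - (1 - Y)/(1 - AL))     # derivative of binary cross-entropy
#   dAL/dZL = AL * (1 - AL)                   # derivative of sigmoid
#   dL/dZL  = dL/dAL * dAL/dZL = AL - Y       # the sigmoid term cancels
AL = np.array([[0.8, 0.3]])                   # example activations (made up)
Y  = np.array([[1.0, 0.0]])                   # example labels (made up)
dAL = -(np.divide(Y, AL) - np.divide(1 - Y, 1 - AL))
dZL_explicit = dAL * AL * (1 - AL)            # chain rule applied step by step
dZL_shortcut = AL - Y                         # the (AL - Y) form the comment refers to
assert np.allclose(dZL_explicit, dZL_shortcut)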

Hey, I implemented backpropagation as the dot product of dL/dA with dA/dZ according to your video. In my implementation, dL/dA and dA/dZ both have the shape (training size, image height, image width, channel size). If this is correct, how should we take the dot product?

zonunmawiazadeng