Neural Network learns sine function in NumPy/Python with backprop from scratch


Timestamps (a rough code sketch of these steps follows the list):
00:00 Intro
02:00 The dataset
02:25 MLP architecture with sigmoid activation function
03:26 Forward/Primal pass
06:40 Xavier Glorot weight initialization
08:06 Backward/Reverse pass
14:15 "Learning": approximately solving an optimization problem
15:10 More details on the backward pass and pullback operations
16:52 Imports
17:07 Setting random seed
17:24 Constants/Hyperparameters
18:08 Toy dataset generation
19:56 Defining nonlinear activation functions
20:39 Implementing Parameter initialization
24:45 Implementing Forward pass
27:20 Implementing loss function
29:06 backward function of the loss
30:36 Backward pass of the network
45:29 Training loop
48:15 Plot loss history
48:36 Plot trained network prediction
49:20 Summary
50:59 Outro
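
The chapter list above can be condensed into a short NumPy sketch. This is a minimal illustration under assumptions made here, not the video's exact code: the layer sizes, learning rate, epoch count, variable names (sigmoid, init_params, forward, mse_loss, backward) and the choice of an identity activation on the output layer are all picked for the sketch.

import numpy as np

np.random.seed(42)

# Hyperparameters (illustrative values, not necessarily the video's)
N_SAMPLES = 200
LAYER_SIZES = [1, 16, 16, 1]   # input -> hidden -> hidden -> output
LEARNING_RATE = 0.1
N_EPOCHS = 5000

# Toy dataset: samples of the sine function
x = np.random.uniform(-np.pi, np.pi, (N_SAMPLES, 1))
y = np.sin(x)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Xavier/Glorot initialization: scale limits by fan-in and fan-out
def init_params(layer_sizes):
    params = []
    for fan_in, fan_out in zip(layer_sizes[:-1], layer_sizes[1:]):
        limit = np.sqrt(6.0 / (fan_in + fan_out))
        W = np.random.uniform(-limit, limit, (fan_in, fan_out))
        b = np.zeros(fan_out)
        params.append((W, b))
    return params

# Forward pass: sigmoid on hidden layers, identity on the output layer.
# Intermediate activations are cached for the backward pass.
def forward(params, x):
    activations = [x]
    a = x
    for i, (W, b) in enumerate(params):
        z = a @ W + b
        a = z if i == len(params) - 1 else sigmoid(z)
        activations.append(a)
    return activations

def mse_loss(y_hat, y):
    return np.mean((y_hat - y) ** 2)

# Backward pass: pull the loss gradient back layer by layer
def backward(params, activations, y):
    grads = [None] * len(params)
    y_hat = activations[-1]
    delta = 2.0 * (y_hat - y) / y.shape[0]   # d(loss)/d(y_hat) for the MSE
    for i in reversed(range(len(params))):
        W, b = params[i]
        a_prev = activations[i]
        grads[i] = (a_prev.T @ delta, delta.sum(axis=0))
        if i > 0:
            # through the linear map, then through the sigmoid of the previous layer
            delta = (delta @ W.T) * activations[i] * (1.0 - activations[i])
    return grads

# Training loop: plain full-batch gradient descent
params = init_params(LAYER_SIZES)
loss_history = []
for epoch in range(N_EPOCHS):
    activations = forward(params, x)
    loss_history.append(mse_loss(activations[-1], y))
    grads = backward(params, activations, y)
    params = [(W - LEARNING_RATE * dW, b - LEARNING_RATE * db)
              for (W, b), (dW, db) in zip(params, grads)]

print("final loss:", loss_history[-1])

The output layer is left linear (identity activation) so the network can produce the full range of sin(x); plotting the loss history and the trained prediction, as done in the video, is left out of the sketch.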

Comments

Five years ago you were my thermodynamics tutor in Braunschweig, and even back then you stood out for your gift of conveying complex topics understandably and, above all, with infectious enthusiasm. Great to see you still enjoy doing it!

mDsStudio

Great tutorial. Shouldn't the last line in the last layer be y4hat = I(y4tilde)?

corneliusotchere

Awesome video as usual!! I've been following your channel for a while now, and all I can say is: amazing content. I was wondering if you could make a video on fluid-structure interaction simulation, such as a deformable beam; that would be great.

JudeLight-hi

Thank you very much.

I have also tried to implement various types of ML algorithms and document them in Google Colab as well.

phurisottatipreedawong