Neural Networks using Lux.jl and Zygote.jl Autodiff in Julia


Timestamps:
00:00 Intro
01:23 Imports
02:33 Constants/Hyperparameters
03:27 Instantiating the random number generator
03:57 Generate a toy dataset of noisy sine samples
06:11 Define neural architecture
10:13 Initialize network parameters (and layer states)
11:39 Network prediction with initial parameter state
14:25 Forward function: parameters to loss mapping
16:12 Preparing the optimizer
16:53 Train loop start
17:10 Transformed forward pass
18:51 Using vjp/back function for reverse pass
21:03 Update parameters with the gradient (parameter cotangent)
21:35 Finish training loop
22:21 Run training loop & investigate loss history
23:17 Prediction with trained parameters
24:42 Summary
25:53 Outro
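
The pipeline the timestamps outline can be sketched end to end in Julia. This is a minimal sketch, not the video's exact code: the hyperparameter values, layer sizes, and variable names are illustrative assumptions, while the API calls (`Lux.setup`, applying the model as `model(x, ps, st)`, `Zygote.pullback` for the vjp, `Optimisers.update`) follow the standard Lux.jl / Zygote.jl / Optimisers.jl interfaces.

```julia
using Lux, Zygote, Optimisers, Random

# Constants/hyperparameters (illustrative values, not from the video)
const N_SAMPLES = 100
const LEARNING_RATE = 0.01
const N_EPOCHS = 10_000

# Instantiate the random number generator for reproducibility
rng = Xoshiro(42)

# Toy dataset of noisy sine samples, shaped (features, batch) as Lux expects
x = reshape(collect(range(0.0f0, 2.0f0 * π; length=N_SAMPLES)), 1, :)
y = sin.(x) .+ 0.1f0 .* randn(rng, Float32, 1, N_SAMPLES)

# Define the neural architecture: a small MLP
model = Chain(Dense(1 => 16, tanh), Dense(16 => 16, tanh), Dense(16 => 1))

# Initialize network parameters (and layer states)
ps, st = Lux.setup(rng, model)

# Forward function: parameters -> scalar loss (mean squared error)
function loss_fn(p)
    ŷ, _ = model(x, p, st)
    return sum(abs2, ŷ .- y) / N_SAMPLES
end

# Prepare the optimizer
opt_state = Optimisers.setup(Adam(LEARNING_RATE), ps)

loss_history = Float32[]
for epoch in 1:N_EPOCHS
    # Transformed forward pass: returns the loss and a pullback (vjp) closure
    loss, back = Zygote.pullback(loss_fn, ps)
    # Reverse pass: seed the output cotangent with 1 to get the gradient
    grads = back(one(loss))[1]
    # Update parameters with the gradient (parameter cotangent)
    opt_state, ps = Optimisers.update(opt_state, ps, grads)
    push!(loss_history, loss)
end
```

After the loop, `loss_history` can be plotted to inspect training, and a prediction with the trained parameters is just `first(model(x, ps, st))`.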
Comments

This is seriously an amazing channel. A while ago I read a performance comparison between Julia and other high-level programming languages, and what astounded me was Julia's flexibility together with its speed (close to C++). So now my question to you: can Julia overcome Python? I'm currently using Python, but it has some limitations, above all speed and a lack of backwards compatibility, while Julia seems to solve these two main drawbacks. On the other hand, Julia doesn't have all of Python's libraries and strong community (TensorFlow by Google, for instance).

matteopiccioni

Hi! Thank you very much for the video! I just wanted to ask: what causes the peak around epoch 1.7e4 in the plot of the loss function?

diegosorte