Just In Time (JIT) Compilation Speed Test

In this JAX tutorial, I look into Just-In-Time (JIT) compilation and show how JAX traces Python code into the jaxpr intermediate representation for efficient execution on GPUs and TPUs.
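
Below is a minimal sketch (not the video's exact code) of what the tutorial walks through: timing a JIT-compiled function against its uncompiled version, peeking at the compilation cache via the private _cache_size() helper mentioned in the chapters, and printing the jaxpr that JAX traces the Python code into. The layer shape and loop counts are illustrative assumptions.

import time

import jax
import jax.numpy as jnp


def mlp_layer(w, b, x):
    # One dense layer with a tanh nonlinearity, standing in for the
    # MLP/CNN training steps timed in the video.
    return jnp.tanh(x @ w + b)


key = jax.random.PRNGKey(0)
w = jax.random.normal(key, (512, 512))
b = jnp.zeros(512)
x = jax.random.normal(key, (1024, 512))

layer_jit = jax.jit(mlp_layer)

# The first call triggers tracing + XLA compilation; later calls reuse the cache.
layer_jit(w, b, x).block_until_ready()

t0 = time.perf_counter()
for _ in range(100):
    layer_jit(w, b, x).block_until_ready()
print("JIT:   ", time.perf_counter() - t0)

t0 = time.perf_counter()
for _ in range(100):
    mlp_layer(w, b, x).block_until_ready()
print("no JIT:", time.perf_counter() - t0)

# Private helper discussed in the video: how many compiled versions are cached.
print("cache size:", layer_jit._cache_size())

# The jaxpr intermediate representation that JAX traces the function into.
print(jax.make_jaxpr(mlp_layer)(w, b, x))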
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
- sponsor this channel on GitHub Sponsors:
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
📹 Video edit: Adobe Premiere Rush
🎧 Audio enhancement: Adobe Podcast
🖼️ Thumbnails: GIMP
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
Chapters:
00:00 start
00:11 about JAX documentation
01:44 start of the code
02:29 MLP architecture
03:31 train functions
04:19 MLP JIT training
05:14 MLP no JIT training
05:56 loading CIFAR10
06:31 CNN architecture
07:08 CNN JIT training
07:30 CNN no JIT training
07:51 plotting the running time
08:18 why is JIT so fast?
10:10 understanding _cache_size()
11:53 checking _cache_size()
13:13 jaxpr code
15:32 final remarks
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
#jax #flax #jaxpr #XLA #convolution #ai #deeplearning #machinelearning #python #neuralnetworks #artificialintelligence #pytorch #torchvision #convolutionalneuralnetworks #image #imageclassification #computervision #GPU #TPU #tutorial