Coding a Neural Network from Scratch in Pure JAX | Machine Learning with JAX | Tutorial #3

❤️ Become The AI Epiphany Patreon ❤️

👨‍👩‍👧‍👦 Join our Discord community 👨‍👩‍👧‍👦

Watch me code a neural network from scratch in this 3rd video of the JAX tutorial series! 🥳

In this video, I create an MLP (multi-layer perceptron) and train it as a classifier on MNIST (although it's trivial to use a more complex dataset) - all this in pure JAX (no Flax/Haiku/Optax).

I then add cool visualizations such as:
* Visualizing the MLP's learned weights
* Visualizing embeddings of a batch of images with t-SNE
* Analyzing dead neurons
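The MLP pieces described above (initialization and prediction) can be sketched in pure JAX roughly like this — a minimal sketch where the layer sizes, the init `scale`, and the function names are my assumptions, not the video's exact code:

```python
import jax
import jax.numpy as jnp

def init_mlp(layer_sizes, key, scale=0.1):
    # One (weight, bias) pair per layer, e.g. [784, 512, 256, 10] for MNIST.
    params = []
    for in_dim, out_dim in zip(layer_sizes[:-1], layer_sizes[1:]):
        key, w_key, b_key = jax.random.split(key, 3)
        params.append((
            scale * jax.random.normal(w_key, (out_dim, in_dim)),
            scale * jax.random.normal(b_key, (out_dim,)),
        ))
    return params

def predict(params, x):
    # Forward pass for a single flattened image: ReLU hidden layers,
    # log-softmax output for numerically stable cross-entropy.
    activation = x
    for w, b in params[:-1]:
        activation = jax.nn.relu(w @ activation + b)
    w_last, b_last = params[-1]
    logits = w_last @ activation + b_last
    return logits - jax.scipy.special.logsumexp(logits)

# vmap lifts the single-example function to work on a batch of images.
batched_predict = jax.vmap(predict, in_axes=(None, 0))
```

`jax.vmap` is what lets the per-example `predict` run over a whole batch without any manual batching logic.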

▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬

⌚️ Timetable:
00:00:00 Intro, structuring the code
00:03:10 MLP initialization function
00:13:30 Prediction function
00:24:10 PyTorch MNIST dataset
00:31:40 PyTorch data loaders
00:39:55 Training loop
00:49:15 Adding the accuracy metric
01:01:45 Visualize the image and prediction
01:04:40 Small code refactoring
01:09:25 Visualizing MLP weights
01:11:30 Visualizing embeddings using t-SNE
01:17:55 Analyzing dead neurons
01:24:35 Outro
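The training-loop and accuracy-metric steps from the timetable can be sketched in pure JAX along these lines (a hedged sketch: `forward`, the parameter shapes, and the learning rate are my assumptions, not the video's exact code):

```python
import jax
import jax.numpy as jnp

def forward(params, images):
    # Batched forward pass; params is a list of (W, b) pairs.
    acts = images
    for w, b in params[:-1]:
        acts = jax.nn.relu(acts @ w.T + b)
    w, b = params[-1]
    return jax.nn.log_softmax(acts @ w.T + b)

def loss_fn(params, images, labels):
    # Cross-entropy: negative log-probability of the correct class.
    log_probs = forward(params, images)
    return -jnp.mean(jnp.take_along_axis(log_probs, labels[:, None], axis=1))

@jax.jit
def update(params, images, labels, lr=0.01):
    # One SGD step; grads have the same pytree structure as params.
    grads = jax.grad(loss_fn)(params, images, labels)
    return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)

def accuracy(params, images, labels):
    preds = jnp.argmax(forward(params, images), axis=-1)
    return jnp.mean(preds == labels)
```

Because `update` is `jax.jit`-compiled and returns a fresh pytree of parameters instead of mutating anything in place, the step stays purely functional — the usual pattern when working in JAX without Flax/Haiku/Optax.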

▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
💰 BECOME A PATREON OF THE AI EPIPHANY ❤️

If these videos, GitHub projects, and blogs help you,
consider helping me out by supporting me on Patreon!

Huge thank you to these AI Epiphany patrons:
Eli Mahler
Petar Veličković
Bartłomiej Danek
Zvonimir Sabljic

▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬

#jax #neuralnetwork #coding
Comments:

At 55:00, a workaround for loading all the data at once was to set batch_size to the full dataset size

hosseinarjomandi
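The workaround mentioned in the comment above can be sketched with a PyTorch DataLoader like this — a minimal sketch using a stand-in TensorDataset; in the video this would be the torchvision MNIST dataset:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Stand-in for MNIST: 60 fake 1x28x28 images with integer labels.
dataset = TensorDataset(torch.randn(60, 1, 28, 28),
                        torch.randint(0, 10, (60,)))

# Setting batch_size to the dataset length makes the loader yield the
# entire dataset as a single batch -- the workaround for grabbing
# everything at once (e.g. to compute metrics over the full set).
full_loader = DataLoader(dataset, batch_size=len(dataset), shuffle=False)
images, labels = next(iter(full_loader))
```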

Loving this series; I'll be sad after I watch the fourth and final video :(

jonathanstreater

Thank you for the content! Would it be possible to make tutorials on JaxOpt by any chance? :)

lidiias

Third video already? Take it easy with this, you're video-bombing; I'm still on the first one. You're making me feel guilty for not pushing enough 😔

iskrabesamrtna

Thanks for the informative peek into an adept coder's workflow. Great content 👍

suwang

By any chance, do you have any tutorial explaining how to make your own dataset?
Your videos are awesome, btw! 😎

leonelp

Thanks for this. It was enjoyable to follow along 👍

gergerger

Watch me code a neural network from scratch in this 3rd video of the JAX tutorial series! 🥳

This is the first video of this kind (coding from scratch) on my YouTube channel - your feedback is much appreciated!

If people find this useful I'll be pumping out more of these videos in the future. I enjoyed making this one.

You'll be able to see how I think while I'm writing code + some messiness and the art of googling haha.

TheAIEpiphany

I had add-ons like yours, but just so you know, Session Buddy can sometimes break and fail to save. It annoyed me quite a few times, so now I use Export Tabs and simply download the saved tabs. Better safe than sorry. I also recommend AdClose, which blocks websites you add to its blacklist; most pop-ups open about:blank.

banuwii

Would be great to watch a video about training with stax... :)

tempdeltavalue

Concerning dead neurons: after training for 5 epochs, I got something like 0 and 170 for the two layers.
The exact numbers vary from run to run (the dataset shuffling isn't seeded), but it's always 0–2 for the first layer and ~170 for the second.
Your values of 0 and 4 must have been for an untrained network!

oleksiygrechnyev
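A way to count dead ReLU neurons like the ones discussed in the comment above could look like this in JAX — a sketch under my own assumptions (it checks only the hidden layers, and "dead" here means zero activation for every image in the batch):

```python
import jax
import jax.numpy as jnp

def count_dead_neurons(params, images):
    # A ReLU neuron is "dead" on this batch if its activation is zero
    # for every input image; count such neurons per hidden layer.
    dead_counts = []
    acts = images
    for w, b in params[:-1]:
        acts = jax.nn.relu(acts @ w.T + b)
        dead = jnp.all(acts == 0.0, axis=0)  # per-neuron, across the batch
        dead_counts.append(int(dead.sum()))
    return dead_counts
```

Counting over a single batch only gives an estimate: a neuron that is silent on one batch may still fire on another, so larger batches give a more reliable count.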

Thanks for the great tutorial. How do you change the font type in Colab?

changtimwu

Hi, hope you're doing well.
Excuse me, I have a problem in (the annotated GAT PPI.ipynb).
I cannot load the PPI dataset.
I receive an error that says:
No such file or directory: c:\\ users feats.npy
Would you please help me?
Thank you

nastaranmarzban

The most crucial mistake in the video: "an MLP" is technically more correct than "a MLP" because M is pronounced with a vowel sound.
Good video otherwise. 😉

mikhaildoroshenko