Thank you. Explaining something like that while looking things up during the explanation is perfect. I also liked the casual, slangy way you explained things. Really easy to follow. More please!!
I think it's just a matter of time before the YouTube algorithm hooks onto your videos and starts recommending them.
gapsongg
Great explanation, thanks. It cleared up a couple of things I wasn't completely understanding.
dustfurn
Thanks for the video! In general, the number of training epochs and the compute time per epoch are independent. So you're right that skipping augmentation couldn't make their time per epoch shorter, but training in fewer epochs is due to a better model architecture. Also, data loading + augmentation can happen on the CPU in parallel with the forward/backward pass on the GPU.
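(Not from the video; just to illustrate the commenter's last point.) The CPU/GPU overlap they describe is what e.g. PyTorch's DataLoader does with worker processes: while the accelerator runs the forward/backward pass on batch i, workers are already loading and augmenting batch i+1. A toy stdlib-only sketch of that prefetching pattern, with made-up helper names and sleeps standing in for real work:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def load_and_augment(batch_idx):
    # stands in for CPU-side data loading + augmentation
    time.sleep(0.01)
    return [x * 2 for x in range(batch_idx, batch_idx + 4)]

def train_step(batch):
    # stands in for the GPU forward/backward pass
    time.sleep(0.01)
    return sum(batch)

def train(num_batches):
    losses = []
    with ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(load_and_augment, 0)  # prefetch the first batch
        for i in range(num_batches):
            batch = future.result()  # wait only if loading is slower than training
            if i + 1 < num_batches:
                # kick off loading of the next batch while this one trains
                future = pool.submit(load_and_augment, i + 1)
            losses.append(train_step(batch))
    return losses

print(train(3))  # → [12, 20, 28]
```

With the overlap, each iteration costs roughly max(load time, step time) instead of their sum, which is why augmentation often adds no wall-clock time per epoch.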
uberdavid
You are very good at explaining. I have no background in ML and I was able to understand it. Thank you
arx
It seems spot on: as of late, GPT-4 sometimes writes 2 different answers side by side and asks you to pick the best!
freedom_aint_free
Thank you for the video. I learned a lot from you.
Since ViT is data hungry, I am wondering how I can train a CNN (ResNet or EfficientNet) in the JEPA way?
Do you have any comment on this? Thank you again for creating such detailed work.
buh
14 is the patch size and not the number of patches.
symbolorate
Soumith Chintala is the guy behind PyTorch, btw
gan
It's ridiculous to suggest, in your background explanation, that Yann LeCun is implying others are copying his idea. The main author is not Yann LeCun. And even if LeCun were the author, he was simply explaining his perspective when he came up with the new idea. His paper was not about why you should view others' work through his lens; it's about what he was thinking when he came up with the new idea.
So are you suggesting that anyone explaining themselves is narcissistic? Better to focus on the idea than waste time policing narcissism.
hunghuynh
I don't know why, but it feels like you are reading the paper for the first time in this video. Maybe if you read the paper beforehand and then explained it here, the videos would be much shorter.