Divide and Contrast Explained!

Divide and Contrast (DnC) is an interesting strategy for using clustering inside a contrastive self-supervised learning pipeline. Its three-stage pipeline trains local expert models on clustered subsets of the data, and the cluster assignments give those experts a cleaner signal for representation learning!
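As a rough illustration of the "divide" step, here is a minimal sketch: embeddings from a base self-supervised model are clustered with k-means, and each cluster would then define the training subset for one local expert before distillation. The embeddings, cluster count, and farthest-point initialization are illustrative assumptions, not details taken from the paper or the video.

```python
# Hedged sketch of the "divide" step in a DnC-style pipeline:
# cluster base-model embeddings so each cluster can be handed to
# its own local expert model. Pure-Python k-means; the toy data
# and k=2 are placeholders, not values from the paper.

def dist2(a, b):
    """Squared Euclidean distance between two embedding vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(embeddings, k, iters=20):
    # Farthest-point initialization keeps the sketch deterministic:
    # start from the first point, then repeatedly add the point
    # farthest from all chosen centers.
    centers = [embeddings[0]]
    while len(centers) < k:
        centers.append(max(embeddings,
                           key=lambda e: min(dist2(e, c) for c in centers)))
    assignments = [0] * len(embeddings)
    for _ in range(iters):
        # Assignment step: each embedding goes to its nearest center.
        for i, e in enumerate(embeddings):
            assignments[i] = min(range(k), key=lambda c: dist2(e, centers[c]))
        # Update step: recompute each center as the mean of its members.
        for c in range(k):
            members = [e for e, a in zip(embeddings, assignments) if a == c]
            if members:
                dim = len(members[0])
                centers[c] = [sum(m[d] for m in members) / len(members)
                              for d in range(dim)]
    return assignments

# Toy 2-D "embeddings": two well-separated blobs stand in for the
# representation of an unlabeled, heavy-tailed dataset.
data = [[0.0, 0.1], [0.1, 0.0], [5.0, 5.1], [5.1, 5.0]]
labels = kmeans(data, k=2)
# Each cluster's images would then train one expert model, and the
# experts would later be distilled back into a single network.
```

In the actual method the clustering operates on learned representations from the first training stage; the point of the sketch is only the shape of the divide step, not the training itself.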

Paper Links:

Chapters
0:00 Paper Title
0:04 Heavy-tailed unlabeled data
1:05 DnC Algorithm
3:26 MoCLR Design
5:55 Expert Distillation
7:49 Results
12:28 Algorithm Ablations
13:20 Clustering in Self-Supervised Learning
15:35 Class Imbalance in Self-Supervised Learning

Thanks for watching! Please Subscribe!
Comments

Another home run with the paper selection and clear walk-through! I think there is great potential in self-supervised learning, as it is perfect for including domain knowledge as a selection metric/bias. Once you know the domain, you can construct transformations that conserve the underlying symmetries and therefore can "easily" classify the data. Of course, this is easier said than done.
Again, thank you very much for doing these videos. Your delivery is something other ML channels could take note of when presenting a paper.

um

Could this work with patches if there are multiple classes per image?

JohnDoe-thcc

Either the channel or YouTube keeps disabling my comments. All I am asking for is an explanation of SimpleHTR. I even tried to email the channel. Please reply to the email; even a negative reply is welcome.

namratamulwani