Understanding Self-Supervised Learning Dynamics without Contrastive Pairs | Outstanding Paper | ICML 2021


0:00 Understanding Self-Supervised Learning Dynamics without Contrastive Pairs
0:27 Non-contrastive SSL (BYOL/SimSiam)
1:37 A simple model
2:45 The Dynamics of Training Procedure
4:26 Part II Assumptions
5:56 Symmetrized Dynamics under the Three Assumptions
7:13 Why Doesn't Non-Contrastive SSL Collapse?
9:31 Part III The Effect of Weight Decay
10:06 The Benefit of Weight Decay
11:20 Exponential Moving Average Rate
11:52 Part IV DirectPred (sketched in code below)
14:33 Performance of DirectPred on ImageNet
15:53 Conclusion
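Below is a minimal code sketch of the DirectPred idea covered in the 11:52 chapter: instead of learning the BYOL/SimSiam predictor by gradient descent, set it directly from an eigendecomposition of a moving-average correlation matrix of the online network's features. The function names, the EMA rate rho, the boost eps, and the exact eigenvalue normalization here are illustrative assumptions, not the authors' reference implementation.

# Hedged sketch of DirectPred: set the linear predictor W_p from an
# eigendecomposition of a moving-average feature correlation matrix,
# rather than training it by gradient descent. Names, rho, eps, and the
# eigenvalue normalization are illustrative assumptions.
import numpy as np

def update_correlation(F, features, rho=0.3):
    """EMA of the feature correlation matrix F ~ E[f f^T] over batches."""
    batch_corr = features.T @ features / len(features)  # (d, d)
    return rho * F + (1.0 - rho) * batch_corr

def direct_pred(F, eps=0.1):
    """Set the predictor as W_p = U diag(sqrt(lam_hat) + eps) U^T, where
    lam_hat are the eigenvalues of F normalized by the largest one."""
    eigvals, U = np.linalg.eigh(F)            # F is symmetric PSD
    eigvals = np.clip(eigvals, 0.0, None)     # guard tiny negative values
    lam_hat = eigvals / max(eigvals.max(), 1e-12)
    p = np.sqrt(lam_hat) + eps * (lam_hat > 0)
    return (U * p) @ U.T                      # U diag(p) U^T

# Toy usage with d-dimensional features from one batch of the online branch.
d = 8
F = np.zeros((d, d))
features = np.random.randn(256, d)
F = update_correlation(F, features)
W_p = direct_pred(F)
predictions = features @ W_p.T                # predictor applied to features

Per the 14:33 chapter, the paper reports that this direct setting of the predictor is competitive with a gradient-trained predictor on ImageNet; the sketch above only illustrates the predictor update, not the full two-branch training loop.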

Award-Winning | Outstanding Paper, Best Paper, and Honorable Mention Award Papers of the ICLR and ICML 2021 Conferences
Momentum Residual Neural Networks | ICML 2021
Score-Based Generative Modeling through Stochastic Differential Equations | Best Paper | ICLR 2021
Rethinking Architecture Selection in Differentiable NAS | Best Paper Award | ICLR 2021
Optimal Rates for Averaged Stochastic Gradient Descent under Neural Tangent Kernel Regime | ICLR 2021
Neural Synthesis of Binaural Speech From Mono Audio | Best Paper Award | ICLR 2021
Learning Mesh-Based Simulation with Graph Networks | Best Paper Award | ICLR 2021
EigenGame: PCA as a Nash Equilibrium | Outstanding Paper Award | ICLR 2021
Solving high-dimensional parabolic PDEs using the tensor train format | Outstanding Paper | ICML 2021
Understanding Self-Supervised Learning Dynamics without Contrastive Pairs | Outstanding Paper | ICML 2021
Oops I Took A Gradient: Scalable Sampling for Discrete Distributions | Outstanding Honorable Mention | ICML 2021
Optimal Complexity in Decentralized Training | Outstanding Paper Honorable Mention | ICML 2021
Unbiased Gradient Estimation in Unrolled Computation Graphs | Outstanding Paper Award | ICML 2021
H/T: Khawar Islam
#computervision #machinelearning #artificialintelligence #AI