AI Weekly Update - June 16th, 2021 (#35!)

Content Links Below:

Chapters
0:00 Introduction
15:17 Generative Models as a Data Source for Multi-View Representation Learning
18:14 Learning to See by Looking at Noise
21:01 Knowledge Distillation: A Good Teacher is Patient and Consistent
24:30 Does Knowledge Distillation Really Work?
26:37 AdaMatch
29:47 Self-Damaging Contrastive Learning
31:35 Masked Self-Supervised Transformer for Visual Representation
33:12 Data-Efficient Instance Generation from Instance Discrimination
34:34 Scaling Vision Transformers
36:48 CoAtNet
38:17 Improving Language Model Behavior by Training on a Curated Dataset
39:11 Dynamic Language Models for Continuously Evolving Content
40:38 GPT-J-6B
41:43 An Empirical Survey of Data Augmentation for Limited Data Learning in NLP
43:03 Meta-Learning with Fewer Tasks through Task Interpolation
44:24 Exploring the Limits of Out-of-Distribution Detection
45:10 Causality in Deep Learning
46:42 A graph placement methodology for fast chip design
47:49 Pretraining Representations for Data-Efficient Reinforcement Learning
49:20 CodeXGLUE
49:52 HF Zero Shot Classification
50:24 HuggingFace Course
50:52 Neural Structured Learning
51:09 GPT-3 vs. GPT-Neo Data Picture
52:12 Audio to video generation
52:55 New PaperswithCode Datasets
Comments

Thank you very much for doing these videos! The series is really helpful and provides a snapshot of interesting and valuable work in what has become an overhyped field riddled with projects presented as research.

um

It's really helpful to get some of the latest advancements in deep learning in a single video... Thank you!

roughr

First, thanks; as always, very valuable information. Just one comment: is there a reason not to split the video into smaller ones? I think that way you could get more views, and it would be a little easier to watch. Regards from Colombia!

juanmanuelcirotorres