Equivariant Neural Networks | Part 1/3 - Introduction

▬▬ Papers / Resources ▬▬▬

▬▬ Used Music ▬▬▬▬▬▬▬▬▬▬▬
Music from #Uppbeat (free for Creators!):
License code: WXVHOOZRRWDUCKIU

▬▬ Used Icons ▬▬▬▬▬▬▬▬▬▬

▬▬ Timestamps ▬▬▬▬▬▬▬▬▬▬▬
00:00 Introduction
00:45 Equivariance and Invariance
03:03 CNNs are translation equivariant
04:00 Math notation
04:25 Visual intuition
05:08 Symmetries
06:22 Why are CNNs not rotation equivariant?
07:14 Inductive biases reduce the flexibility
08:10 What's wrong with data augmentations?
09:32 Motivations for Equivariant Neural Networks
09:55 You've unlocked a checkpoint.
10:07 Naturally occurring equivariance
10:50 Group Equivariant Convolutional Neural Networks
11:37 Group Theory (on a high level)
12:41 An example and the matrix notation
13:50 Group axioms
14:32 Cayley tables
15:33 Examples for groups
16:38 Applications of Equivariant Neural Networks
18:30 Final Checkpoint :)
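The timestamps above cover translation equivariance of CNNs (03:03) and the accompanying math notation (04:00). As a rough companion sketch (not from the video), here is a minimal NumPy check that a 1-D circular convolution commutes with shifts, i.e. applying the convolution to a shifted signal gives the same result as shifting the convolved signal:

```python
import numpy as np

def conv1d(x, k):
    # Circular 1-D cross-correlation of signal x with kernel k.
    n = len(x)
    return np.array([sum(k[j] * x[(i + j) % n] for j in range(len(k)))
                     for i in range(n)])

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
k = np.array([0.5, 0.25])

# Translation equivariance: f(shift(x)) == shift(f(x)).
shifted_then_conv = conv1d(np.roll(x, 2), k)
conv_then_shifted = np.roll(conv1d(x, k), 2)

print(np.allclose(shifted_then_conv, conv_then_shifted))  # True
```

The circular (periodic) padding is an assumption to keep the sketch exact; with zero padding the property only holds away from the boundaries, and for rotations it fails entirely, which is the gap group-equivariant CNNs (10:50) address.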

▬▬ Support me if you like 🌟

▬▬ My equipment 💻
▬▬ Comments ▬▬▬▬▬▬▬▬▬▬▬

Honestly, this is one of the best introductions to the topic. A lot of lectures dive straight into group theory, and people without the relevant background immediately lose interest when they see the mathematical concepts and axioms. This channel deserves more subs and views ❤❤

ajwadakil

You have a good ability to explain difficult subjects simply. There are tutorial videos on Geometric Deep Learning from Bronstein (who organized the GDL tutorial school with Cohen, Bruna, etc.) that cover this concept, but their tutorials require more math background. Excellent.

sari

I love your content and often come back to recall important concepts. Thank you very much, and I hope that soon I will be able to afford to buy you a coffee.

jangradkowski

I've been looking into this topic these days; this was super helpful for wrapping my head around it!

Blueshockful

Wonderful video, the examples are incredibly intuitive.

spaceflame

Hi, the lower complexity of the network is stated as an advantage. Complex in what sense? My definition of a complex network is one with more filters/neurons/weights to learn, so how would equivariance reduce connections? Those are fixed by our hand-crafted layer topology, no?

imolafodor

Clean as fuck. Thank you very much. Nice visuals, nice explanations. Easy to follow.

gapsongg

Nice video. Could you please also create a video on capsule networks? With CapsNets you can also achieve equivariance in images.

shaz-z

Is it possible to share the slides as well?

chinmay.prabhakar