AMMI Course 'Geometric Deep Learning' - Lecture 6 (Graphs & Sets II) - Petar Veličković

Video recording of the course "Geometric Deep Learning" taught in the African Master in Machine Intelligence in July-August 2021 by Michael Bronstein (Imperial College/Twitter), Joan Bruna (NYU), Taco Cohen (Qualcomm), and Petar Veličković (DeepMind)

Lecture 6: General attributed graphs • Graph networks • DeepSets • Transformers • Neural relational inference • Dynamic Graph CNN • Differentiable Graph Module • Pointer Graph Networks • Weisfeiler-Lehman test • Higher-order GNNs

Comments

Great lecture Petar! Enjoyed it and learned new stuff. Here are the timestamps followed by some questions:
00:00 Intro, recap
05:15 Including edge features and graph-level features (Graph Nets example)
16:45 Latent graph inference (DeepSets, Transformers, and beyond)
41:25 How powerful are GNNs? (Weisfeiler-Lehman test, higher order GNNs, etc.)
54:10 Outro and Q&A

Qs:
1) 22:00 Are they strictly equivalent for all tasks? In a GCN, every node will additionally have this aggregation of all nodes' features (albeit the same for every node), which should be more expressive when we're doing a node-level classification/regression task. What am I missing here? For graph-level classification they do seem equivalent.

2) 47:07 Why are continuous features a game changer; what's the intuition behind that? Did you imply that if we had continuous features, GNNs would be more powerful than WL-1? Or just that in the discrete case GNNs can be less expressive than WL-1, whereas in the continuous case they are always as expressive as WL-1 (ignoring the higher-order GNN designs)?

3) (nit) 01:02:15 Did you call it a high-six theorem? (First time I've heard it referred to that way :D)

4) How often are PNAs used in practice? Petar, do you know of any specific examples where the additional theoretical expressivity was desired despite the increased computational requirements?

TheAIEpiphany
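As a side note on question 2 above: the 1-WL test mentioned at 41:25 is easy to demonstrate concretely. Below is a minimal, hedged sketch (not code from the lecture) of one round of 1-WL colour refinement, together with the classic pair of graphs it cannot distinguish: two disjoint triangles versus a 6-cycle, both 2-regular.

```python
from collections import Counter

def wl_refine(colors, adj):
    """One round of 1-WL colour refinement: each node's new colour is a
    hash of its own colour plus the sorted multiset of its neighbours'
    colours."""
    return {u: hash((colors[u], tuple(sorted(colors[v] for v in adj[u]))))
            for u in adj}

# Two triangles vs. a 6-cycle: non-isomorphic, but every node has degree 2,
# so with uniform initial colours 1-WL keeps all colours identical and the
# two graphs are never told apart.
two_triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1],
                 3: [4, 5], 4: [3, 5], 5: [3, 4]}
six_cycle = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}

c1 = {u: 0 for u in range(6)}
c2 = {u: 0 for u in range(6)}
for _ in range(3):  # a few refinement rounds
    c1, c2 = wl_refine(c1, two_triangles), wl_refine(c2, six_cycle)

print(Counter(c1.values()) == Counter(c2.values()))  # True: same histogram
```

This is exactly the kind of failure case that motivates the higher-order GNNs covered later in the lecture.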

Thanks for the clear and nice lectures on Graphs and Sets!!

yen-linchen

Is it possible to have some references on coarsening methods for graphs? I don’t think we have access to the resources Petar mentioned. Thank you!

giulioortali

Maybe I'm missing something, but don't we also need to update the x_u node (to become h_u), as we did with x_v -> h_v via the phi function? (time 16:24)

konstantins

Is the equation of the node feature update at 11:10 wrong? I think the summation should be over "v ∈ N_u".

marijansmetko
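Regarding the last two questions (the update at 16:24 and the summation index at 11:10): in the generic message-passing form, every node u does get its own update h_u, and the aggregation runs over v ∈ N_u. A hedged Python sketch (psi/phi follow the lecture's notation; the toy functions and dictionaries are made up for illustration, not taken from the slides):

```python
def update_node(u, x, neighbours, psi, phi):
    """h_u = phi(x_u, sum over v in N_u of psi(x_u, x_v)).
    Every node u receives its own update; the sum is over v in N_u."""
    aggregated = sum(psi(x[u], x[v]) for v in neighbours[u])
    return phi(x[u], aggregated)

# Toy scalar example (psi/phi are placeholders):
x = {0: 1.0, 1: 2.0, 2: 3.0}
N = {0: [1, 2], 1: [0], 2: [0]}
psi = lambda xu, xv: xv        # message: the neighbour's feature
phi = lambda xu, m: xu + m     # update: add the aggregated messages
h = {u: update_node(u, x, N, psi, phi) for u in x}
print(h)  # {0: 6.0, 1: 3.0, 2: 4.0}
```

Applying the same phi to every node keeps the layer permutation-equivariant, which is why x_u is updated by the same mechanism as every other node.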