Graph Theory Blink 10 (3 rules of geometric deep learning: locality, aggregation, and composition).

#graphNeuralNetworks #geometricDeepLearning #graphConvolutionalNetworks

Lecture 10 is a brief introduction to geometric deep learning: an exciting research field at the intersection of graph theory and deep learning.

In this lecture, I cover the three fundamental rules driving the field of geometric deep learning (a small code sketch follows the list):
1) Locality: “tell me who your neighbours are, and I will tell you who you are”,
2) Aggregation: “how do you integrate the information or messages you get from your neighbours?”, and
3) Composition: “how deeply do you want to learn from your neighbours’ messages?”
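
To make the three rules concrete, here is a minimal sketch (not code from the lecture) of a single message-passing layer written in plain Python/NumPy. The toy 4-node graph, the feature sizes, the mean aggregator, and the tanh update are illustrative assumptions rather than the specific formulation used in the video.

import numpy as np

def gnn_layer(adj, features, weight):
    # Locality: node v only reads the entries of `adj` marking its direct neighbours.
    # Aggregation: neighbour features are combined with a simple mean (an assumption;
    # sum, max, or attention-weighted aggregators are common alternatives).
    n = adj.shape[0]
    out = np.zeros((n, weight.shape[1]))
    for v in range(n):
        neighbours = np.nonzero(adj[v])[0]            # locality
        message = features[neighbours].mean(axis=0)   # aggregation
        out[v] = np.tanh(message @ weight)            # node update
    return out

# Toy 4-node cycle graph with 3-dimensional node features (illustrative data).
adj = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [1, 0, 1, 0]], dtype=float)
x = np.random.randn(4, 3)
w1 = np.random.randn(3, 8)
w2 = np.random.randn(8, 8)

# Composition: stacking two layers lets each node mix information from its 2-hop neighbourhood.
h = gnn_layer(adj, x, w1)
h = gnn_layer(adj, h, w2)
print(h.shape)  # (4, 8)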

**** Resources and further readings ****

1. Stanford course “CS224W: Machine Learning with Graphs”, offered by Jure Leskovec:
A special thanks to my students Alin Banka and Inis Buzi for sharing this with me :)
2. Wu, Z., Pan, S., Chen, F., Long, G., Zhang, C. and Yu, P.S., 2019. A comprehensive survey on graph neural networks. arXiv preprint arXiv:1901.00596.
3. Zhou, J., Cui, G., Zhang, Z., Yang, C., Liu, Z., Wang, L., Li, C. and Sun, M., 2018. Graph neural networks: A review of methods and applications. arXiv preprint arXiv:1812.08434.
4. Graph-based deep learning literature in top conferences:
Comments

This is super awesome! Really love the lucid visualizations with great explanations!

arkaung

If we have feature vectors for the graph arcs, how will the aggregation formula be changed?

Ali-neel

Thank you very much for that great explanation. Could you please share simple source code for a GNN? I know I can find Python code on GitHub, but for me it is really complicated.

Ali-neel