GraphSAGE to GraphBERT - Theory of Graph Neural Networks
We start with the differentiable aggregator functions of GraphSAGE and the permutation invariance of graphs, followed by a mathematical presentation of convolutional, attentional, and message-passing neural networks.
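As a rough illustration of the aggregation idea, here is a minimal NumPy sketch of a GraphSAGE-style mean-aggregation layer (hypothetical names such as sage_mean_layer; not code from the talk or the paper). Averaging over neighbours is what makes the update permutation invariant: reordering the neighbour list cannot change the result.

import numpy as np

def sage_mean_layer(h, neighbors, W_self, W_neigh):
    # h: (N, d_in) node features; neighbors: list of neighbour-index lists per node.
    out = np.empty((h.shape[0], W_self.shape[1]))
    for v, nbrs in enumerate(neighbors):
        # Mean over neighbour features; zero vector if the node has no neighbours.
        agg = h[nbrs].mean(axis=0) if nbrs else np.zeros(h.shape[1])
        out[v] = np.maximum(h[v] @ W_self + agg @ W_neigh, 0.0)  # ReLU nonlinearity
    # L2-normalise each node embedding, as in the GraphSAGE paper.
    return out / (np.linalg.norm(out, axis=1, keepdims=True) + 1e-12)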
This leads to Transformers applied to graphs, using Laplacian eigenvectors as positional encodings.
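A minimal sketch, again with hypothetical names and under the assumption of an undirected graph, of how Laplacian eigenvectors can serve as node positional encodings: take the eigenvectors of the symmetric normalised Laplacian, skip the trivial constant one, and use the next k per node.

import numpy as np

def laplacian_pos_enc(adj, k):
    # adj: (N, N) symmetric adjacency matrix; returns (N, k) positional encodings.
    deg = adj.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.clip(deg, 1e-12, None))
    # Symmetric normalised Laplacian: L = I - D^{-1/2} A D^{-1/2}.
    L = np.eye(adj.shape[0]) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    eigvals, eigvecs = np.linalg.eigh(L)  # eigenvalues in ascending order
    # Skip the trivial (near-constant) eigenvector; use the next k as node "positions".
    return eigvecs[:, 1:k + 1]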
Presentation slides by Petar Velickovic (see link below), author and presenter of the seminar "Theory of GNN", 17 February 2021. All rights to these presentation slides belong to him.
Inductive Representation Learning on Large Graphs
Slides on the Theory of Graph Neural Networks are available at:
*Thanks to Petar Velickovic for making his informative slides (as a PDF file) publicly available.*
A Generalization of Transformer Networks to Graphs
#GraphSAGE
#GraphBERT
#GraphNN
00:00 GraphSAGE
07:30 Aggregator Functions
08:30 Permutation Equivariance
12:20 GNN Aggregator
16:12 GNN Meta-structure
19:55 Transformers are GNN
21:15 Graph Laplacian
22:00 Graph Transformer