PyTorch Geometric tutorial: Graph Attention Networks (GAT) implementation

In this video we will see the math behind GAT and a simple implementation in PyTorch Geometric.

Outline:
- Recap
- Introduction
- GAT
- Message Passing PyTorch layer
- Simple GCN layer implementation
- GAT implementation
- GAT usage (a minimal usage sketch follows this outline)
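
A minimal usage sketch of PyG's built-in GATConv layer, as a taste of what the video builds up to. This is my own example, not necessarily the exact code from the video; the Cora dataset, the root path and the hyperparameters are placeholders.

import torch
import torch.nn.functional as F
from torch_geometric.datasets import Planetoid
from torch_geometric.nn import GATConv

dataset = Planetoid(root="/tmp/Cora", name="Cora")  # placeholder path
data = dataset[0]

class GAT(torch.nn.Module):
    def __init__(self):
        super().__init__()
        # 8 attention heads, concatenated -> 8 * 8 = 64 hidden features
        self.conv1 = GATConv(dataset.num_features, 8, heads=8, dropout=0.6)
        # a single averaged head produces the class logits
        self.conv2 = GATConv(8 * 8, dataset.num_classes, heads=1, concat=False, dropout=0.6)

    def forward(self, x, edge_index):
        x = F.elu(self.conv1(x, edge_index))
        return self.conv2(x, edge_index)

model = GAT()
logits = model(data.x, data.edge_index)  # shape [num_nodes, num_classes]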

Download the material from the official web site:
Comments

I encountered several YouTube videos explaining the paper, but I am glad I bumped into your video, with the actual implementation code explained so elegantly. Thanks for taking the time to share it.

tamago_roe

Thanks for putting together these PyTorch Geometric tutorials, really valuable material 👏

rsilveira

The guy was bored out of his mind, 15:13 deep breath LOL. Thank you though, I appreciate it; this work helps me.

aamm

Great series so far. The explanations are extremely clean and balanced.

francescopuddu

There is a lot of background noise, which is really distracting.

NPEMSEC

Great tutorial, thanks Antonio. One quick question: could you please explain how the normalization step works in your code? It's the only ambiguous part for me. I would really appreciate your answer.

sinaziaee
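
A note on the normalization question above, assuming the code follows the standard PyG GCN example (the symmetric normalization of Kipf and Welling): every message sent along an edge (j -> i) is scaled by

\mathrm{norm}_{ij} = \frac{1}{\sqrt{\deg(i)}\,\sqrt{\deg(j)}}

which is the edge-wise form of the propagation rule \hat{D}^{-1/2} \hat{A} \hat{D}^{-1/2} X W with \hat{A} = A + I. A code sketch of the same step appears under the longer question on 23:49 below.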

Thanks for the effort, but who the heck is typing while unmuted?

VasudevaK

Thank you very much for this great course

MaryamSadeghi-uu

I am enjoying it very much; only the continuous keyboard typing, or something like that, is annoying me.

abdurrahimsheikh

23:49 I don't get the normalization step. I checked the "degree" function that computes the degrees from the edge_index, and by the way, on the line "row, col = edge_index" I don't see how we get the row or the column, because this is the COO representation of the graph. Another strange thing: after you (somehow) get deg_inv_sqrt, how does it work for both source and target once you index deg_inv_sqrt with [row] and [col]? Can you please explain step 3 in a more detailed way?
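
A sketch of that step for anyone else stuck on it, assuming the code follows PyG's standard "Creating Message Passing Networks" GCN example: edge_index is a [2, E] COO tensor, so "row, col = edge_index" simply unpacks its two rows (source and target node indices), and indexing deg_inv_sqrt with [row] and [col] gathers one per-node factor for each end of every edge.

import torch
from torch_geometric.utils import add_self_loops, degree

def gcn_norm(x, edge_index):
    # edge_index has shape [2, E]: edge_index[0] = source nodes, edge_index[1] = target nodes
    edge_index, _ = add_self_loops(edge_index, num_nodes=x.size(0))
    row, col = edge_index                           # unpack the two rows of the COO tensor
    deg = degree(col, x.size(0), dtype=x.dtype)     # degree of every target node
    deg_inv_sqrt = deg.pow(-0.5)
    deg_inv_sqrt[deg_inv_sqrt == float('inf')] = 0  # isolated nodes get coefficient 0
    # step 3: one coefficient per edge, 1 / (sqrt(deg(source)) * sqrt(deg(target)))
    norm = deg_inv_sqrt[row] * deg_inv_sqrt[col]
    return edge_index, norm                         # norm is later multiplied onto each message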


@21:57, maybe the 'int' method should be 'init'? The initialization should be defined in the __init__ function.

congqi

These are really great! Is there any way that the typing sound could be removed in upcoming videos, though? I find it a little distracting.
Thanks so much for these videos! :)

jacanchaplais

Next time, ask the person who presses the buttons to leave; it makes it very difficult to listen.

KonstantinKlepikov

@11:26, could you please explain what "a" is and how we get the 2Fx1 shape? Is it a randomly initialised weight vector?

josephsylvan
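
For the question above: in the GAT paper, a is a learnable weight vector that is randomly initialised and trained like any other parameter. It has size 2F' x 1 because it scores the concatenation of two transformed feature vectors, each of size F':

e_{ij} = \mathrm{LeakyReLU}\big(\mathbf{a}^{\top} [\mathbf{W}\mathbf{h}_i \,\|\, \mathbf{W}\mathbf{h}_j]\big), \qquad \mathbf{W} \in \mathbb{R}^{F' \times F}, \quad \mathbf{a} \in \mathbb{R}^{2F'}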

In PyTorch Geometric, when should I use a tuple as in_channels? What exactly are the source and destination dimensions in in_channels that the PyTorch Geometric documentation refers to?

akashdutta
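
Regarding the question above: the tuple form of in_channels is for bipartite message passing, where source and destination nodes have different feature sizes. The first entry is the feature dimension of the source nodes (the ones sending messages) and the second that of the destination nodes (the ones being updated); a single int means both sides share one size. A hedged sketch with made-up user/item node types (SAGEConv here, but GATConv accepts a tuple the same way):

import torch
from torch_geometric.nn import SAGEConv

x_user = torch.randn(100, 16)  # source nodes: 100 users, 16 features each
x_item = torch.randn(50, 32)   # destination nodes: 50 items, 32 features each
edge_index = torch.stack([
    torch.randint(0, 100, (200,)),  # row 0: source (user) indices
    torch.randint(0, 50, (200,)),   # row 1: destination (item) indices
])

conv = SAGEConv(in_channels=(16, 32), out_channels=64)
out = conv((x_user, x_item), edge_index)  # shape [50, 64]: one row per destination node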

34:53 I think it's a bit difficult to understand why you use -9e15 as the mask value if you don't explain that the attention matrix has to be softmaxed, so such a big negative number will be squashed to zero. Don't you think it would be better to mask the attention matrix AFTER the softmax, with a simple element-wise multiplication with adj?
Anyway, very interesting presentation!

carloaironi
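
On the question above, a toy sketch (not the video's exact code) of why the mask is applied before the softmax: exp(-9e15) underflows to zero, so the softmax still normalises over the real neighbours only, whereas multiplying by adj after the softmax leaves rows that no longer sum to 1 and would have to be renormalised anyway.

import torch

e = torch.randn(4, 4)                    # raw attention scores e_ij
adj = (torch.rand(4, 4) > 0.5).float()   # toy adjacency matrix
adj.fill_diagonal_(1)                    # keep self-attention so no row is empty

# mask BEFORE the softmax: non-edges get -9e15, and exp(-9e15) is numerically 0
masked = torch.where(adj > 0, e, torch.full_like(e, -9e15))
alpha = torch.softmax(masked, dim=1)
print(alpha.sum(dim=1))        # ~[1., 1., 1., 1.]

# mask AFTER the softmax: the normalisation is broken
alpha_after = torch.softmax(e, dim=1) * adj
print(alpha_after.sum(dim=1))  # generally < 1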

That was so nicely explained, thank you ❣

expectopatronum

I have a question! Does the GAT class provide an argument or function to turn skip connections on for the PPI dataset?

daekyoung_jung
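
On the skip-connection question above: as far as I know the GAT class written in the video does not expose such a flag, and whether the built-in GATConv has a residual argument depends on your PyG version, so check the documentation. A manual residual around a hidden layer, in the spirit of what the GAT paper uses for PPI, can be sketched as follows (layer sizes are placeholders):

import torch
import torch.nn.functional as F
from torch_geometric.nn import GATConv

class GATWithSkip(torch.nn.Module):
    def __init__(self, in_dim, hidden, heads=4):
        super().__init__()
        self.conv1 = GATConv(in_dim, hidden, heads=heads)
        self.conv2 = GATConv(hidden * heads, hidden, heads=heads)
        # project the residual so it matches the second layer's output size
        self.skip = torch.nn.Linear(hidden * heads, hidden * heads, bias=False)

    def forward(self, x, edge_index):
        x = F.elu(self.conv1(x, edge_index))
        # skip connection: layer output plus a linear projection of its input
        x = F.elu(self.conv2(x, edge_index) + self.skip(x))
        return x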

Hi, great video. However, everybody talks about the Cora dataset, where nodes and edges can be easily defined.
My question is: how can we perform node classification for data consisting of an ECG signal, tabular data with M observations and N features, or, say, images? How do we begin constructing a graph for such datasets: defining the nodes and then building the edges?

naimahmednesaragi
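
One common recipe for the question above (my own sketch, not from the video): treat every observation (a row of the M x N table, an ECG window, an image embedding, and so on) as a node whose feature vector is its N features, and connect each node to its k nearest neighbours in feature space. Note that knn_graph needs the torch-cluster extension in most PyG versions, and for images or signals the per-node features would usually be an embedding rather than raw values; what counts as a node and an edge is ultimately a modelling choice.

import torch
from torch_geometric.data import Data
from torch_geometric.nn import knn_graph

x = torch.randn(1000, 32)                     # M = 1000 observations, N = 32 features
edge_index = knn_graph(x, k=10, loop=False)   # COO edge list from k-nearest neighbours
y = torch.randint(0, 4, (1000,))              # toy node labels
data = Data(x=x, edge_index=edge_index, y=y)  # ready for a GCN/GAT node classifier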

Hi, it is a really nice tutorial, but can I ask specifically about masked attention?
I think it is related to the adjacency matrix: if a position of the adjacency matrix is 1, then it takes e (what we calculated);
if not, a very small number (-9e+15). Why should we use it?

hyunsikyun
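
On the masked-attention question above: in the GAT paper the softmax over the attention scores is simply restricted to the neighbourhood of node i (including i itself). Filling the non-edge positions with -9e15 is a dense-matrix way to get the same effect, because exp(-9e15) is numerically zero and those positions drop out of the sum:

\alpha_{ij} = \frac{\exp(e_{ij})}{\sum_{k \in \mathcal{N}(i)} \exp(e_{ik})}, \qquad j \in \mathcal{N}(i)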