An Introduction to Graph Neural Networks: Models and Applications

MSR Cambridge, AI Residency Advanced Lecture Series
An Introduction to Graph Neural Networks: Models and Applications

"Graph Neural Networks (GNNs) are a general class of networks that operate over graphs. By representing a problem as a graph, encoding the information of individual elements as nodes and their relationships as edges, GNNs learn to capture patterns within the graph. These networks have been successfully used in applications such as chemistry and program analysis. In this introductory talk, I will do a deep dive into neural message-passing GNNs and show how to create a simple GNN implementation. Finally, I will illustrate how GNNs have been used in applications."
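The talk walks through building a simple message-passing GNN. As a rough companion, here is a minimal NumPy sketch of one round of neural message passing; all names, shapes, and the tanh update are my own illustration under the talk's general description, not the speaker's code:

```python
import numpy as np

def message_passing_step(H, A, W_msg, W_update):
    """One round of neural message passing (illustrative sketch).

    H:        (num_nodes, D) current node states
    A:        (num_nodes, num_nodes) adjacency, A[i, j] = 1 if there is an edge j -> i
    W_msg:    (D, D) message transformation
    W_update: (2*D, D) state-update transformation
    """
    messages = A @ (H @ W_msg)                 # each node sums its neighbors' transformed states
    combined = np.concatenate([H, messages], axis=1)
    return np.tanh(combined @ W_update)        # new node states, shape (num_nodes, D)

# Tiny usage example: 3 nodes where node 0 receives messages from nodes 1 and 2.
rng = np.random.default_rng(0)
D = 4
A = np.array([[0., 1., 1.],
              [0., 0., 0.],
              [0., 0., 0.]])
H = rng.normal(size=(3, D))
print(message_passing_step(H, A,
                           rng.normal(size=(D, D)),
                           rng.normal(size=(2 * D, D))).shape)  # (3, 4)
```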

Comments
Author

*My takeaways:*
1. Background 0:48
2. Graph neural networks (GNN) and neural message passing 6:35
- Gated GNN 26:35
- Graph convolutional networks 29:27
3. Expressing GGNNs as matrix operations 33:36
4. GNN application examples 41:25
5. Other models as special cases of GNNs 47:53
6. ML in practice 49:28

leixun
Author

At 34:11, the product of matrix A and matrix N should be [ b + c ; c ; 0 ]
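For context, in the matrix formulation each row of A @ N sums the states of that node's neighbors. A hypothetical 3-node reconstruction consistent with this commenter's correction (the actual matrices from the slide are not reproduced here):

```python
import numpy as np

# Assumed setup: node states a, b, c, and an adjacency matrix where node 0
# receives from nodes 1 and 2, node 1 from node 2, and node 2 from no one.
# Then A @ N sums each node's incoming neighbor states.
a, b, c = 1.0, 2.0, 3.0
N = np.array([[a], [b], [c]])
A = np.array([[0., 1., 1.],
              [0., 0., 1.],
              [0., 0., 0.]])
print(A @ N)  # [[b + c], [c], [0]] -> [[5.], [3.], [0.]]
```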

iltseng
Author

Don't know why people are criticizing this video and the audience. Great introduction to graph neural networks!

syllogismo
Author

So GNNs are basically something like computing word embeddings in NLP. We have a dataset describing the relationships between pairs of words (nodes), and we want a vector representation that reflects how often they co-occur (the weight of the edge between the nodes), i.e., how related the two words are. Once we have such vectors, we can build a vanilla, recurrent, or convolutional neural net to learn a mapping between the vectors and the output we desire.

susmitislam
Author

"Spherical Cow" - funniest analogy yet for a Neural Net layers. Great talk

rembautimes
Author

Great talk! The audience questions were helpful, but there were a few too many of them, to the point that they negatively affected the flow of the talk.

mehmetf.demirel
Author

Wow, I saw a bunch of other videos about GNNs and totally missed how they really work. Great presentation, thanks a lot!

netional
Author

Awesome talk! The MSR audience asked quite a few questions, which are actually helpful, e.g., what GNNs are, how they work and update, why they were created and designed this way, etc.

runggp
Author

The presenter's ability to explain and his use of high-level diagrams were phenomenal. Questions from the audience definitely disrupted the flow of the presentation quite a bit, though.

ashutoshukey
Author

At 16:51, I think he meant "for each node connected to n" (instead of n_j), because in the expression we take all nodes n_j connected to n in order to calculate the new state h_t^n of node n.
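For reference, the neural message-passing update is commonly written as below (notation is mine, not the slide's); the sum does run over all nodes n_j connected to n, as the commenter says:

```latex
h^{(n)}_{t+1} = f\left( h^{(n)}_t,\ \sum_{n_j \in \mathcal{N}(n)} g\left( h^{(n_j)}_t \right) \right)
```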

alphaO
Author

Wow, what an excellent presentation, speaking as someone with an ML background. It explains the basics a bit but also covers deep concepts. Super clear graphics! Seriously, whoever made the graphics for this: can I hire you to do my slide graphics? And I thought it was very cool that the lecture attendees were bold enough to ask so many questions. I wish people asked more questions during my lectures and talks.

Peaceluvr
Author

While the audience questions were irritating (to put it mildly), bombarding the speaker during his intro with questions that could reasonably be expected to be answered later in a one-hour talk, why would the speaker give a talk on one of the most advanced neural network architectures to an audience without any machine learning background?

MobileComputing
Author

The questioner at 40:00 was right. I was also really confused; all the matrix operations seemed to be invalid unless the operands were swapped. What kind of inverted conventions are these?

codewithyouml
Author

This is why you have the Q&A at the end of the presentation.

rarelycomments
Author

At 29:35, about GCNs, he said you multiply the sum of the messages by your own state, but in the equation it is a sum. I didn't get which one is correct.
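For comparison, the GCN layer of Kipf and Welling is usually written as below, which may resolve the confusion: the node's own state enters through the self-loops in Â = A + I, so it is summed with the neighbor messages rather than multiplied:

```latex
H^{(l+1)} = \sigma\left( \hat{D}^{-1/2} \hat{A} \hat{D}^{-1/2} H^{(l)} W^{(l)} \right), \qquad \hat{A} = A + I
```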

Exhora
Author

Where can we get the slide deck, please?

sm_xiii
Author

He explains using time progression, which caused some confusion for the audience and me.

heejuneAhn
Author

At 36:53, what is M in the shape (num_nodes x M)?

lidiias
Author

At 35:40, I think the dimensionality of M should be (num_nodes x D), unless D == M.
EDIT: from what follows, it should be M = HE, and D can be different from M.
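A quick shape check of that reading (the sizes and the names H and E here are my guesses at the slide's notation), which also answers the question above about M at 36:53:

```python
import numpy as np

num_nodes, D, M = 5, 8, 16          # message size M can differ from state size D
H = np.random.randn(num_nodes, D)   # node states: one D-dimensional row per node
E = np.random.randn(D, M)           # transformation producing messages
messages = H @ E                    # M = HE, as the commenter notes
print(messages.shape)               # (num_nodes, M)
```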

mansurZ
Author

Actually, I read the original research paper on this network ("The Graph Neural Network Model" by Franco Scarselli et al., 2009), which established a new field in artificial intelligence. I found slight differences between the speaker's words and the paper, and then it became clear to me that he was talking about this neural network in general terms. Overall, the audience's behavior was unsatisfactory, and there were many interruptions that left the speaker confused and disorganized. The last thing I would like to say is that this field originated in a 2008 research paper, which was published in 2009. The researchers said that it is a network that combines the features and strengths of RNNs with the idea of working with Markov models, wrapped in the concepts of graph theory.
