Graph Neural Networks (GNN) | Nodes, Edges, Adjacency Matrix, Message Passing, Aggregation explained

Welcome to Lecture 1 of our project-based GNN course. This lecture will give you a basic overview of GNNs.

The best way to learn ML is by doing, and the best thing to do is work towards an impactful research paper. You might be a complete beginner, but don't worry: this course is meant exactly for you.

=================================================

The Graph Neural Network (GNN) lecture series + research is a project instructed by Ms. Aiswarya Nandakumar.

The GNN lecture series is not a normal video course. In this project, we will teach GNNs, and how to conduct research in them, from scratch. We will prepare lecture notes and also share reference material.

As we work through the material, we will share thoughts on what is actually useful in industry and what has become irrelevant. We will also point out which topics contain open areas of research, so that interested students can start their research journey there.

If you are confused or stuck in your ML journey, perhaps courses and offline videos are not inspiring enough. What might inspire you is watching someone else learn and implement machine learning from scratch.

No cost. No hidden charges. Pure old school teaching and learning.

=================================================

🌟 Meet Our Team: 🌟

🎓 Dr. Raj Dandekar (MIT PhD, IIT Madras department topper)

🎓 Dr. Rajat Dandekar (Purdue PhD, IIT Madras department gold medalist)

🎓 Dr. Sreedath Panat (MIT PhD, IIT Madras department gold medalist)

🎓 Ms. Aiswarya Nandakumar (Data Scientist, Entrepreneur and ML instructor)
Comments

I rarely comment on videos, but the quality of the explanation is so good that it is worth appreciating! Waiting for the next video!

Fire_AJ_

Very clear explanation. Can you suggest some good resources for learning Graph Networks? We need more lecturers like this.

HnM

Thank you. I found this video very helpful. In particular, I liked how you went through the message aggregation steps. In other videos I have watched, message aggregation is described with abstract maths notation that left me still vague about what is happening. Also, it's great to see an up-to-date series of tutorials coming out. I can find a lot of tutorials from 3-4 years ago, but not much since then.

indigo
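The comment above refers to the message-aggregation step covered in the lecture. As a rough sketch of that idea (not the lecture's own code; the graph, features, and weights below are made up), one round of sum-aggregation message passing can be written in a few lines of NumPy:

```python
import numpy as np

# Toy undirected graph with 4 nodes (a, b, c, d) and edges a-b, a-c, b-c, c-d.
# A[i][j] = 1 means nodes i and j are connected.
A = np.array([
    [0, 1, 1, 0],
    [1, 0, 1, 0],
    [1, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

# Each node starts with a 2-dimensional feature vector (random stand-ins here).
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 2))

# Message passing with sum aggregation: every node receives the sum of its
# neighbours' feature vectors. Multiplying by the adjacency matrix does exactly that.
messages = A @ H

# A simple update: mix each node's own features with the aggregated message
# through a (random) weight matrix and a non-linearity.
W = rng.normal(size=(2, 2))
H_next = np.tanh((H + messages) @ W)

print(H_next)  # updated node embeddings after one message-passing round
```

Swapping the sum for a mean (dividing each row of `messages` by the node's degree) or a max gives the other common aggregation choices.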

Great lecture! Covered everything from theory to math concepts.

anishj

Good example (Sam) and simple style. Anyone can follow it easily.

padmanabhannamboothiri

This is Vivek Karmarkar - really loved the visuals and the simplicity of the lecture!

physicsanimated

Very well explained and easy to understand. When are you guys releasing the next video?

devanshuagarwal

Thank you, very well explained.
I have a small doubt: at 22:06, in the node initialisation process, are the nodes initialised randomly or is there some method to do so?

khushbukhushbu
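On the initialisation question above: common choices are to initialise node embeddings from whatever node attributes exist (or from one-hot node IDs fed through a learnable embedding table), and to fall back on random vectors when the nodes carry no features. The snippet below is only an illustrative sketch of those two options, not necessarily the method used at 22:06 in the lecture:

```python
import numpy as np

num_nodes, dim = 4, 8
rng = np.random.default_rng(42)

# Option 1: random initialisation, used when nodes have no attributes of their own.
H_random = rng.normal(size=(num_nodes, dim))

# Option 2: start from known node information, e.g. one-hot node IDs projected
# through an embedding table (random here, but learned during training in practice).
one_hot_ids = np.eye(num_nodes)
embedding_table = rng.normal(size=(num_nodes, dim))
H_from_ids = one_hot_ids @ embedding_table

print(H_random.shape, H_from_ids.shape)  # both (4, 8)
```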

Loved it, but can you please share the slides also?

rajehkumarmishra

This is really good. It would be great to know how to build message-passing graph embeddings for molecular structures and do a pooling operation to create a single vector. Will there be a coding session in upcoming sessions?

kinglobby
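On the molecular-graph question above: once message passing has produced an embedding per node (per atom), a readout/pooling step collapses those embeddings into one vector for the whole molecule. A minimal sketch, assuming the per-atom embeddings have already been computed (random stand-ins below):

```python
import numpy as np

# Pretend message passing has produced an 8-dimensional embedding for each of
# a molecule's 5 atoms.
rng = np.random.default_rng(7)
node_embeddings = rng.normal(size=(5, 8))

# Readout / pooling: reduce all node embeddings to a single graph-level vector.
graph_vec_mean = node_embeddings.mean(axis=0)  # mean pooling
graph_vec_sum = node_embeddings.sum(axis=0)    # sum pooling, another common choice

print(graph_vec_mean.shape)  # (8,) -- one vector representing the whole molecule
```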

Could you please also share some research papers, books, or other resources for reference?

anuragupperwal

Why aren't you considering the edge between a and d for aggregation?

lekshmip

Kindly provide links to resources (slides and other documents shown in the lectures) if possible.

RashFord-pf