Attention for Neural Networks, Clearly Explained!!!

Attention is one of the most important concepts behind Transformers and Large Language Models, like ChatGPT. However, it's not that complicated. In this StatQuest, we add Attention to a basic Sequence-to-Sequence (Seq2Seq or Encoder-Decoder) model and walk through how it works and is calculated, one step at a time. BAM!!!
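
For anyone who wants to follow along in code, here is a minimal NumPy sketch of the attention step for a single decoder time step, mirroring the three stages covered in the video (dot-product similarity, softmax weights, weighted sum). The vectors and sizes below are made-up illustrations, not values from the video:

import numpy as np

def softmax(x):
    # Subtract the max before exponentiating for numerical stability.
    e = np.exp(x - x.max())
    return e / e.sum()

# Hypothetical encoder outputs: one hidden state per input word
# (3 input words, hidden size 2). These numbers are made up.
encoder_states = np.array([[0.5, 0.1],
                           [0.3, 0.9],
                           [0.8, 0.4]])

# Hypothetical decoder hidden state at the current output step.
decoder_state = np.array([0.6, 0.2])

# Step 1: dot-product similarity between the decoder state and each
# encoder state (a higher score means the states are more similar).
scores = encoder_states @ decoder_state      # shape: (3,)

# Step 2: softmax turns the similarity scores into attention weights
# that sum to 1.
weights = softmax(scores)                    # shape: (3,)

# Step 3: the attention value is the weighted sum of the encoder
# states; it is combined with the decoder state to predict the
# next output word.
attention_value = weights @ encoder_states   # shape: (2,)

print("attention weights:", weights)
print("attention value:", attention_value)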

If you'd like to support StatQuest, please consider...
...or...

...buying my book, a study guide, a t-shirt or hoodie, or a song from the StatQuest store...

...or just donating to StatQuest!

Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on Twitter:

0:00 Awesome song and introduction
3:14 The Main Idea of Attention
5:34 A worked out example of Attention
10:18 The Dot Product Similarity
11:52 Using similarity scores to calculate Attention values
13:27 Using Attention values to predict an output word
14:22 Summary of Attention

#StatQuest #neuralnetwork #attention
Comments

“StatQuest is all you need!” I really needed this video for my NLP course, but I'm glad it's out now. I got an A+ in the course; your videos helped a lot!

koofumkim

Somehow Josh always figures out which videos we are going to need!

atharva

The level of explanation in this video is top-notch. I always watch your videos first to grasp the concept, then do the implementation on my own. Thank you so much for this work!

MelUgaddan

For this video, attention is all you need.

nikolamarkovic

Dang, this came out just 2 days after my neural networks final. I'm still so happy to see this video in my feed. You do such great work, Josh! Please keep it up for all the computer scientists and statisticians who love your videos and eagerly await each new post.

dylancam

I was literally trying to understand attention a couple of days ago, and Mr. BAM posts a video about it. Thanks 😊

rutvikjere

The amount of effort that went into some of these animations, especially in these videos on Attention and Transformers, is insane. Thank you!

sameepshah

Can't thank this guy enough; he helped me get my master's degree in AI back in 2022. Now I'm working as a data scientist and I still keep going back to your videos.

lunamita

This channel is pure gold. I'm a machine learning and deep learning student.

SharingFists

This is awesome mate, can't wait for the next installment! Your tutorials are indispensable!

clockent

Hi Mr. Josh, I just wanna say that there is literally no one who makes it as easy for me to understand such complicated concepts. Thank you! Once I get a job, I will make sure to give you guru dakshina! (meaning, an offering from students to their teachers)

sinamon

Great work, Josh! Listening to my deep learning lectures and reading papers became way easier after watching your videos, because you explain the big picture and the context so well!! Eagerly waiting for the Transformers video!

Travel-Invest-Repeat

This is the best explanation ever, not only in this video but across the entire channel. Thanks a lot...

OsamaAlatraqchi

I was just reading the original attention paper and then BAM! You uploaded the video. Thank you for creating the best content on AI on YouTube!

aquater

1 million subscribers INCOMING!!!
Also, huge thanks to Josh for providing such insightful videos. These videos really make everything easy to understand. I was trying to understand Attention and BAM!! I found this gem.

aayush

The BEST explanation of Attention models!! Kudos & Thanks 😊

ncjanardhan

The best explanation of Attention that I have come across so far ...
Thanks a bunch! ❤

ArpitAnand-ydtr

I just wanna let you know that this series is absolutely amazing. So far, as you can see, I've made it to the 89th video; I guess that's something. Now it's getting serious, though. Again, love what you're doing here, man!!! Thanks!!

benmelis

Hello StatQuest, I would like to say thank you for the amazing job. This content helped me understand a lot about how Attention works, especially because visuals help me understand better, and the way you join the visual explanation with the verbal one while keeping it interesting is on another level. Amazing work <3

brunocotrim

The end is a classic cliffhanger for the series. You talk about how we don't need the LSTMs, and now I'll be waiting an entire summer for Transformers. Good job! :)

usser-