Retrieval Augmented Generation (RAG) Explained: Embedding, Sentence BERT, Vector Database (HNSW)


In this video we explore the entire Retrieval Augmented Generation pipeline. I will start by reviewing language models, their training and inference, and then explore the main ingredient of a RAG pipeline: embedding vectors. We will see what embedding vectors are, how they are computed, and how we can compute embedding vectors for entire sentences. We will also explore what a vector database is, along with the popular HNSW (Hierarchical Navigable Small Worlds) algorithm that vector databases use to find the embedding vectors closest to a query.
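The retrieval step described above can be sketched in a few lines of plain Python. This is an illustrative toy (the vectors, document names, and function names are made up, not from the video): it implements the naive k-NN search the video contrasts with HNSW, scoring every stored embedding against the query by cosine similarity.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: dot product divided by the product of the norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def knn_search(query, documents, k=2):
    # Naive k-NN: compare the query against EVERY stored vector (O(N)).
    # HNSW exists precisely to avoid this exhaustive scan.
    scored = sorted(documents.items(),
                    key=lambda item: cosine_similarity(query, item[1]),
                    reverse=True)
    return [doc_id for doc_id, _ in scored[:k]]

# Toy 3-dimensional "embeddings"; a real pipeline would use e.g.
# 384-dimensional Sentence-BERT vectors.
docs = {
    "doc_a": [0.9, 0.1, 0.0],
    "doc_b": [0.1, 0.9, 0.0],
    "doc_c": [0.8, 0.2, 0.1],
}
query = [1.0, 0.0, 0.0]
print(knn_search(query, docs, k=2))  # -> ['doc_a', 'doc_c']
```

In a full RAG pipeline the top-k documents returned here would be pasted into the prompt as context before the language model generates its answer.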

Chapters
00:00 - Introduction
02:22 - Language Models
04:33 - Fine-Tuning
06:04 - Prompt Engineering (Few-Shot)
07:24 - Prompt Engineering (QA)
10:15 - RAG pipeline (introduction)
13:38 - Embedding Vectors
19:41 - Sentence Embedding
23:17 - Sentence BERT
28:10 - RAG pipeline (review)
29:50 - RAG with Gradient
31:38 - Vector Database
33:11 - K-NN (Naive)
35:16 - Hierarchical Navigable Small Worlds (Introduction)
35:54 - Six Degrees of Separation
39:35 - Navigable Small Worlds
43:08 - Skip-List
45:23 - Hierarchical Navigable Small Worlds
47:27 - RAG pipeline (review)
48:22 - Closing
Comments

This is what a teacher with deep knowledge of what they are teaching can do. Thank you very much.

nawarajbhujel

Wow! I finally understood everything. I am an ML student and have already watched half of your videos. Thank you so much for sharing. Greetings from Jerusalem

tryit-wvui

This is one of the best explanations I have ever seen on YouTube. Thank you.

faiyazahmad

Man, your content is awesome. Please do not stop making these videos as well as code walkthroughs.

wilsvenleong

You are the best ML teacher I have come across. Thanks for sharing the knowledge.

ramsivarmakrishnan

Best explanation I found on YouTube, thank you!

bevandenizclgn

Wow, thanks a lot. This is the best explanation of RAG I have found on YouTube

sarimhashmi

The way you've explained all these concepts has blown my mind. I won't be surprised to see your subscriber count skyrocket. Subscribed!

mohittewari

Amazing! I finally understood everything. Good job; all your videos show in-depth understanding

venkateshdesai

Learning becomes more interesting and fun when you have a teacher like Umar, who explains everything related to the topic so well that everyone feels like they fully understand the algorithms.
A big fan of your teaching methods, Umar. Thanks for making all these informative videos.

DeepakTopwal-slbw

Amazing teacher! 50 minutes flew by :)

Rockermiriam

I have waited for content like this for a while. You made my day; I think I got almost everything. So educational. Thank you Umar

redfield

What an exceptional explanation of the HNSW algorithm ❤

sumansan

Impressively intuitive, something most explanations are not. Great video!

alexsguha

This was fantastic and I learned a lot from it! Thanks for putting this lesson together!

alexandredamiao

Thanks Umar. I look forward to your videos because you explain each topic in an easy-to-understand way. I would request that you make a "BERT implementation from scratch" video.

tahirrauf

This was fantastic (as usual). Thanks for putting it together. It has helped my understanding no end.

JRB

One of the best channels to learn and grow

ypbgbvt

Awesome content, sir; it is the best explanation I have found so far!

kiranshenvi

Hello sir, I just want to say thanks for creating such good content for us. Love from India :)

vasoyarutvik