RAG with LangChain & Ollama Locally and For Free

In this video, we'll build a RAG app using Ollama and an embedding model locally and for free. We'll track this app in LangSmith.

00:01 Introduction
01:08 Create a virtual environment
01:38 Installation
03:17 Initialize the local model
04:23 Enter LangSmith
07:28 Load data
09:14 Split data
11:50 Create a database
13:56 Retrieve data
15:35 Generate the output
20:23 Summary
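
For reference, the chapters above follow a fairly standard LangChain pipeline. The sketch below is only an illustration of that flow, not the video's exact code: the model names (llama3, nomic-embed-text), the example URL, the Chroma vector store, and the LangSmith project name are assumptions, since the description doesn't specify them.

```python
import os

# LangSmith tracing is switched on through environment variables;
# the chain itself needs no LangSmith-specific code.
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_API_KEY"] = "<your-langsmith-api-key>"
os.environ["LANGCHAIN_PROJECT"] = "rag-ollama-demo"  # hypothetical project name

from langchain_community.chat_models import ChatOllama
from langchain_community.embeddings import OllamaEmbeddings
from langchain_community.document_loaders import WebBaseLoader
from langchain_community.vectorstores import Chroma
from langchain_text_splitters import RecursiveCharacterTextSplitter
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_core.runnables import RunnablePassthrough

# 03:17 Initialize the local model and embeddings served by Ollama.
llm = ChatOllama(model="llama3")                         # assumed model
embeddings = OllamaEmbeddings(model="nomic-embed-text")  # assumed model

# 07:28 Load data -- placeholder URL, swap in your own source.
docs = WebBaseLoader("https://example.com/article").load()

# 09:14 Split data into overlapping chunks.
splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=200)
chunks = splitter.split_documents(docs)

# 11:50 Create a database (in-memory Chroma here) and expose it as a retriever.
vectorstore = Chroma.from_documents(chunks, embeddings)
retriever = vectorstore.as_retriever()

def format_docs(docs):
    # Join the retrieved chunks into a single context string for the prompt.
    return "\n\n".join(d.page_content for d in docs)

# 13:56 Retrieve data and 15:35 generate the output with the local model.
prompt = ChatPromptTemplate.from_template(
    "Answer the question using only this context:\n{context}\n\nQuestion: {question}"
)
chain = (
    {"context": retriever | format_docs, "question": RunnablePassthrough()}
    | prompt
    | llm
    | StrOutputParser()
)

print(chain.invoke("What is the article about?"))
```

With the tracing environment variables set, each chain.invoke call is logged as a trace in the LangSmith project named above, which is how the app is tracked without any extra instrumentation.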

▶️ LangChain Tutorials:
▶️ Generative AI Tutorials:
▶️ LLMs Tutorials:
▶️ HuggingFace Tutorials:

🔥 Thanks for watching. Don't forget to subscribe, like the video, and leave a comment.

#ai #langgraph #generativeai
Comments

I wanted to express my gratitude for your recent video, "RAG with LangChain & Ollama Locally and For Free." Your clear explanations and the included notebook code have significantly aided my understanding of all the steps. Your effort to make complex topics accessible is greatly appreciated. 👍

TooyAshy-

I just found you this morning. One of the best channels out there, so thanks for all your excellent videos. One thing I don't understand is how LangSmith is able to capture what's going on in the notebook. Is Ollama sending data to LangSmith?

mapledev

Hardware specs, please? Already subscribed to your channel for more easy tutorials. Awesome work.

aliaffan

I installed langchain-community, but when I import it, it shows an error.

tsheringwangchuk

I didn't get the API. Can you show some steps for that?

tsheringwangchuk

Sir, couldn't a Turkish version of this series be made?

emindurmus

How can I solve this problem?
AttributeError: module has no attribute ollama

tsheringwangchuk