Enhancing RAG: LlamaIndex and Ollama for On-Premise Data Extraction

LlamaIndex is an excellent choice for implementing RAG. It offers a clean API for connecting to different data sources and extracting data, and it includes an integration for Ollama, which means you can run LlamaIndex against on-premise LLMs served through Ollama. In this video I walk through a sample app in which LlamaIndex works with Ollama to extract data from PDF invoices.

GitHub repo:

0:00 Intro
0:57 Libs
1:46 Config
2:58 Main script
4:43 RAG pipeline
7:02 Example
8:09 Summary

CONNECT:
- Subscribe to this YouTube channel

#ollama #rag #llamaindex
Comments

The only video on this planet which shows everything clearly! Absolutely perfect.

AbhishekShivkumar-tiru

I have one question: is there a way to "save" the index made for the PDF files so that the index can be reused each time the script is run?

tejaslotlikar