LlamaIndex 22: Llama 3.1 Local RAG using Ollama | Python | LlamaIndex


About this video: In this video, you will learn how to create RAG from scratch in LlamaIndex.
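For readers skimming without watching the video, the core of a from-scratch RAG loop can be sketched in plain Python. This is a toy illustration only: a bag-of-words retriever stands in for real embeddings, and the final prompt string stands in for an actual LLM call; all helper names here are made up for the example.

```python
import math
from collections import Counter

def embed(text):
    """Toy 'embedding': a bag-of-words term-frequency vector."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=1):
    """Rank documents by similarity to the query and keep the top k."""
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query, docs):
    """Stuff the retrieved context into the prompt sent to the LLM."""
    context = "\n".join(retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

docs = [
    "Ollama runs large language models locally.",
    "LlamaIndex connects LLMs to external data.",
]
print(build_prompt("What does Ollama do?", docs))
```

In a real pipeline, `embed` would be an embedding model, `retrieve` a vector index, and the built prompt would be sent to a local model via Ollama; the retrieve-then-augment shape stays the same.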

Large Language Model (LLM) - LangChain

Large Language Model (LLM) - LlamaIndex

Machine Learning Model Deployment

Spark with Python (PySpark)

Data Preprocessing (scikit-learn)


#llamaindex #openai #llm #ai #huggingface #api #genai #generativeai #statswire
Comments

Thank you! Yet again! I did use Ollama 'llama3.1:8b' and it answered several queries quite well!

davidtindell

I confirmed that the query did indeed make use of my local NVIDIA GPU, so it was fairly quick but not very fast!

davidtindell

I'm getting this error: "The `__modify_schema__` method is not supported in Pydantic v2. Use instead" in class `SecretStr`. How do I solve it?

bossganabathivellow
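That `__modify_schema__` error usually indicates a Pydantic v1/v2 mismatch: one installed package was written against Pydantic v1 while Pydantic v2 is installed. A hedged sketch of the two usual fixes (exact package names and version pins are assumptions; check what your own environment actually requires):

```shell
# Assumption: an older llama-index release (written for pydantic v1) is
# running next to pydantic v2. Either upgrade llama-index so all packages
# speak pydantic v2...
pip install --upgrade llama-index
# ...or pin pydantic below 2 to match the older release:
pip install "pydantic<2"
```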

So when we execute llm = Ollama(model="llama3.1", request_timeout=420.0), does this mean that we need to deploy Ollama on the local PC and pull llama3.1?

Donovan-pi
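Yes: the `Ollama(...)` class in LlamaIndex is only a client; it sends requests to a locally running Ollama server, so the server must be installed and the model pulled before any query works. A minimal sketch of the setup (port 11434 is Ollama's documented default):

```shell
# Install Ollama from ollama.com, then pull the model weights locally:
ollama pull llama3.1
# Ollama serves an HTTP API on http://localhost:11434 by default;
# LlamaIndex's Ollama LLM class sends its requests to that endpoint.
```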

I'm having compatibility issues. Could you share your Python environment or your libraries' versions?

ackerj

There's a step that I may have missed. Do you have the Llama 3 model installed on your local machine?

pythonantole

I downloaded the model using Ollama on my internet-connected system, but how do I move the model files to an intranet environment? Please help.

kashyapatom

Hi,

I am not getting a response.
I'm getting a "connection refused" error.

Please help me solve this.

saraswathinatarajan
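"Connection refused" almost always means nothing is listening on the port the client is calling, i.e. the Ollama server is not running. A small stdlib check you can run before querying (the default host/port is an assumption based on Ollama's standard setup):

```python
import urllib.request
import urllib.error

def ollama_reachable(base_url="http://localhost:11434", timeout=2.0):
    """Return True if an HTTP server answers at base_url, else False."""
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return 200 <= resp.status < 300
    except (urllib.error.URLError, OSError):
        return False

if not ollama_reachable():
    print("Ollama is not running; start it with `ollama serve` "
          "(or launch the desktop app) and retry.")
```

If this prints the warning, start the Ollama server first; only then will `Ollama(model=...)` queries from LlamaIndex get a response.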

Does it work with PDF images of charts and tables?

rahulsh