JSON Output with Notus Local LLM [LlamaIndex, Ollama, Weaviate]

In this video, I show how to get JSON output from the Notus LLM running locally with Ollama. The JSON output is generated with LlamaIndex using the dynamic Pydantic class approach.
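
For reference, here is a minimal sketch of the dynamic Pydantic class approach. This is not Sparrow's exact code: the field names, the "notus" model tag, and the prompt template are my assumptions; the idea is that the Pydantic class is built at runtime from user-supplied field definitions and then drives LlamaIndex's structured output.

    # Build a Pydantic class at runtime from (name, type) pairs,
    # then have LlamaIndex parse the LLM completion into it.
    from pydantic import create_model
    from llama_index.llms import Ollama
    from llama_index.output_parsers import PydanticOutputParser
    from llama_index.program import LLMTextCompletionProgram

    # Hypothetical fields; in practice these come from user input or config.
    fields = {"invoice_number": (str, ...), "total": (float, ...)}
    DynamicModel = create_model("DynamicModel", **fields)

    # Ollama serves on localhost:11434 by default.
    llm = Ollama(model="notus", base_url="http://localhost:11434",
                 request_timeout=120.0)

    program = LLMTextCompletionProgram.from_defaults(
        output_parser=PydanticOutputParser(output_cls=DynamicModel),
        prompt_template_str=(
            "Extract these fields from the document and answer as JSON: {fields}\n"
            "Document:\n{document}"
        ),
        llm=llm,
    )

    result = program(fields="invoice_number, total",
                     document="Invoice #123 ... total due: 45.60")
    print(result.json())  # use result.model_dump_json() on pydantic v2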

Sparrow GitHub repo:

Argilla Notus repo:

0:00 Intro
0:25 Notus in Sparrow
1:54 Ingest with Weaviate fix
2:40 JSON output with LlamaIndex
4:02 Sparrow engine
5:22 Example
7:15 Summary

CONNECT:
- Subscribe to this YouTube channel

#llm #rag #python
Comments

Is it possible to bolt Microsoft AutoGen on top of your Sparrow all-local framework? A multi-agent framework to interact with a local LLM, local LlamaIndex, local Weaviate, local Pydantic, etc.

faridullahkhan

Why are there two localhost URLs for Ollama, and is there any reason these two might not work?

shivanidwivedi
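
For what it's worth, Ollama serves its REST API on http://localhost:11434 by default; a minimal sketch of pointing the LlamaIndex Ollama client at it (the "notus" model tag is an assumption):

    from llama_index.llms import Ollama

    # Ollama listens on localhost:11434 by default; override base_url
    # only if the server runs on another host or port.
    llm = Ollama(model="notus", base_url="http://localhost:11434",
                 request_timeout=120.0)
    print(llm.complete("Reply with the single word OK."))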

If I run the same command, I get a 404 URL-not-found error.

shivanidwivedi
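
A 404 from Ollama usually means the request path is wrong or the requested model has not been pulled yet. A quick sanity check against Ollama's /api/tags endpoint (the "notus" tag is an assumption):

    import requests

    # List the models the local Ollama server actually has; if "notus"
    # is missing from this list, run `ollama pull notus` first.
    resp = requests.get("http://localhost:11434/api/tags")
    resp.raise_for_status()
    print([m["name"] for m in resp.json().get("models", [])])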