JSON Output with Notus Local LLM [LlamaIndex, Ollama, Weaviate]

In this video, I show how to get JSON output from the Notus LLM running locally with Ollama. The JSON output is generated with LlamaIndex using a dynamic Pydantic class approach.
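The snippet below is a minimal sketch of that dynamic Pydantic class approach, not Sparrow's actual code: the output schema is built at runtime with pydantic.create_model and handed to a LlamaIndex completion program backed by the local Ollama model. The field names, prompt text, and "notus" model tag are illustrative assumptions, and LlamaIndex import paths differ between releases.

# Minimal sketch of the dynamic Pydantic class approach (not Sparrow's code).
# Field names and the "notus" model tag are assumptions; LlamaIndex import
# paths vary between versions.
from pydantic import create_model
from llama_index.core.program import LLMTextCompletionProgram
from llama_index.llms.ollama import Ollama

# Build the output schema at runtime instead of hard-coding a Pydantic class.
fields = {"invoice_number": (str, ...), "invoice_total": (float, ...)}
InvoiceData = create_model("InvoiceData", **fields)

# Notus running locally behind Ollama.
llm = Ollama(model="notus", request_timeout=120.0)

program = LLMTextCompletionProgram.from_defaults(
    output_cls=InvoiceData,
    prompt_template_str=(
        "Extract the requested fields from the document and answer as JSON.\n\n"
        "{document_text}"
    ),
    llm=llm,
)

result = program(document_text="Invoice #123, total due: 45.60 USD")
print(result.model_dump_json())  # JSON matching the dynamically created schema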
Sparrow GitHub repo:
Argilla Notus repo:
0:00 Intro
0:25 Notus in Sparrow
1:54 Ingest with Weaviate fix (generic ingest sketch after this list)
2:40 JSON output with LlamaIndex
4:02 Sparrow engine
5:22 Example
7:15 Summary
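For the ingest step, a typical LlamaIndex + Weaviate setup looks roughly like the sketch below. It is a generic example, not the specific Sparrow fix shown in the video; the index name, local Weaviate URL, and data folder are assumptions, and client/import paths differ between Weaviate and LlamaIndex versions.

# Rough sketch of ingesting documents into Weaviate via LlamaIndex
# (generic setup, not the specific fix discussed in the video).
import weaviate
from llama_index.core import SimpleDirectoryReader, StorageContext, VectorStoreIndex
from llama_index.vector_stores.weaviate import WeaviateVectorStore

# Connect to a locally running Weaviate instance (v3-style client shown;
# newer weaviate-client versions use a different connection API).
client = weaviate.Client("http://localhost:8080")
vector_store = WeaviateVectorStore(weaviate_client=client, index_name="Documents")
storage_context = StorageContext.from_defaults(vector_store=vector_store)

# Load files from a local folder and index them into Weaviate.
# An embedding model must also be configured (LlamaIndex defaults to OpenAI
# embeddings unless a local embedding model is set).
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)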
CONNECT:
- Subscribe to this YouTube channel
#llm #rag #python