Building a local ChatGPT with Chainlit, Mixtral, and Ollama

In this video, we'll learn how to build a mini ChatGPT that runs on our machine using Mixtral, Ollama, LiteLLM, and Chainlit. We'll build it up from scratch, starting with a chatbot that echoes back messages, before moving on to one that remembers the chat history and uses Mixtral to answer questions. Finally, we'll add functionality that lets the chatbot answer questions about uploaded text files.

#llms #ollama #litellm #chainlit
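
For reference, a minimal sketch of the middle stage described above: a Chainlit handler that keeps the chat history in the user session and forwards it to Mixtral through LiteLLM and Ollama. The handler signature depends on your Chainlit version, and the model name assumes mixtral has already been pulled into Ollama on its default port.

# chat.py (a sketch, not the video's exact code)
import chainlit as cl
from litellm import acompletion

@cl.on_chat_start
async def start():
    # Keep the running conversation in the per-user session.
    cl.user_session.set("history", [])

@cl.on_message
async def on_message(message: cl.Message):
    history = cl.user_session.get("history")
    history.append({"role": "user", "content": message.content})

    # Ask Mixtral, served locally by Ollama, through LiteLLM.
    response = await acompletion(
        model="ollama/mixtral",
        messages=history,
        api_base="http://localhost:11434",
    )
    answer = response.choices[0].message.content

    history.append({"role": "assistant", "content": answer})
    await cl.Message(content=answer).send()

Start it with "chainlit run chat.py -w" and it serves the chat UI on a local port.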

Comments

For your file or files of journal entries, would you need to use some form of retrieval-augmented generation (RAG) system, either a vector or graph database depending on the details? Or is your method to add as much as fits into the prompt context window? It is an interesting question... for the questions you asked about the 'database', I wonder what kind of method might work, as a standard graph or vector database probably would not do so well with those questions. There must be a way... very interesting.

geoffreygordonashbrook
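
If the method is simply to place the file contents in the prompt, as the comment above asks, a rough sketch would look like the following; the file name and question are illustrative placeholders, and this only works while the text still fits in Mixtral's context window, beyond which some retrieval step (RAG) becomes necessary.

from litellm import completion

journal_text = open("journal.txt", encoding="utf-8").read()

messages = [
    {
        "role": "system",
        "content": "Answer questions using only these journal entries:\n\n" + journal_text,
    },
    {"role": "user", "content": "What themes come up most often?"},
]

# Everything rides on the context window here; no vector or graph store is involved.
response = completion(
    model="ollama/mixtral",
    messages=messages,
    api_base="http://localhost:11434",
)
print(response.choices[0].message.content)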

That's interesting, and I'd never heard of Chainlit before. Does it store the uploaded files to reuse them in the future, like the Retrieval tool in OpenAI's Assistants API?

alizhadigerov

For example: I have one dataset containing a single column of text data. I want to train a model on this data and then test it on other data that has nearly the same meaning.


Can you please suggest how I can solve this problem?

gurupavankalyanbandaru

One flaw I found with Chainlit is that, unlike Gradio, it does not have native support for concurrency and multi-user access... maybe someone with more time on their hands could try adding these features to Chainlit as a PR?

mirek

I have a problem when I try to receive a response. The error says "All connection attempts failed". I suppose that on my localhost, port 11434 is already in use; what should I try?

davidpujo

Not sure why, but I'm getting the error "AttributeError: 'NoneType' object has no attribute 'decode'" on line 17 of chat.py when trying it with an attached txt file.

robertcmcdermott

News... finally we can remove LiteLLM from every app (it was just a bump in the road), now that Ollama has updated its API to offer an OpenAI-compatible endpoint.

HyperUpscale
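
A minimal sketch of what that looks like: pointing the official openai Python client straight at Ollama's OpenAI-compatible endpoint instead of going through LiteLLM. The base URL assumes Ollama's default port, and the API key is a placeholder that Ollama ignores.

from openai import OpenAI

# Ollama exposes an OpenAI-compatible API under /v1 on its default port.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

response = client.chat.completions.create(
    model="mixtral",
    messages=[{"role": "user", "content": "Say hello from a local model."}],
)
print(response.choices[0].message.content)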

It didn't work for me. I got an error:
OllamaError: b'{"error":"model \'mixtral\' not found, try pulling it first"}'

kirthiramaniyer
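
That error usually just means the model has not been downloaded yet; running "ollama pull mixtral" before starting the app should fix it. A quick sanity check against Ollama's tags endpoint, assuming the default port:

import requests

# List the models Ollama has available locally and look for mixtral.
tags = requests.get("http://localhost:11434/api/tags").json()
installed = [m["name"] for m in tags.get("models", [])]
print("mixtral present:", any(name.startswith("mixtral") for name in installed))
# If this prints False, run "ollama pull mixtral" and try again.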