Replace the OpenAI API with local models: Ollama + LiteLLM, Text Generation WebUI, Google Colab

I walk through my three favourite methods for running an OpenAI-compatible API powered by local models: Ollama + LiteLLM, Text Generation WebUI, and Google Colab.

You can sign up for the next intake of the course I mentioned at

00:00 Introduction
00:29 A simple script
01:02 Ollama + LiteLLM
03:57 textgen-webui
09:31 Google Colab + textgen-webui
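The common thread in all three methods is that the OpenAI client keeps speaking the same protocol; only the base URL changes to point at the local server. A minimal stdlib-only sketch of building such a request, assuming Ollama's default OpenAI-compatible endpoint on port 11434 and a model name like `llama2` (both the port and model name are assumptions; match them to your setup):

```python
import json

def chat_request(model, prompt, base_url="http://localhost:11434/v1"):
    """Build an OpenAI-style chat completion request aimed at a local backend.

    The URL path and body schema follow the OpenAI chat completions API, which
    Ollama, LiteLLM, and Text Generation WebUI all emulate; base_url here is
    Ollama's default and would change for the other backends.
    """
    return {
        "url": f"{base_url}/chat/completions",
        "body": json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }),
    }

req = chat_request("llama2", "Say hello in five words.")
print(req["url"])
```

In practice you would not build requests by hand: the official `openai` Python package accepts a `base_url` argument, so pointing an existing script at the local server is usually a one-line change plus a dummy API key.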
Comments

Thanks for the content! I enjoyed an OpenAI / ChatGPT video for the first time in months; simple and informative, it's really appreciated. I've been searching for a way to replace the OpenAI API token by running my model locally and pointing my apps at it, with zero luck. All I want is to run a chatbot on WordPress with a local LLM using Ollama. Good thing dreaming is still free :)

paritaistudio

I wish I could easily use Ollama together with the power of the text-generation-webui project.

MelroyvandenBerg

Thanks for the video! I got the model working in the Colab WebUI, but the Cloudflare link is broken somehow. Thanks for all your work making this video. Cheers

forestpeoplemushrooms

Great tutorial. Is there a way to run Ngrok and LiteLLM on the Google Colab side and then connect to it from a local machine? That would make Open Interpreter powerful on any machine.

goonie

Thanks, great video!
What's the configuration of your system?

mohsinsheikh

Thanks for sharing, JV! Can you share your local setup too (in the vids or in a linked video) so people like myself can get a realistic measure of what we can achieve at home? You touched on it a bit in this video, but any additional info would be :chefskiss:

rich.fortune