Ollama adds OpenAI API support

Ollama became OpenAI API compatible and all rejoiced...well everyone except LiteLLM! In this video, we'll see how this makes it easier to compare OpenAI and open-source models and then we'll update a ChainLit app that I built with LiteLLM and Ollama to use Ollama via the OpenAI library instead.
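As a minimal sketch of what the video demonstrates: Ollama exposes an OpenAI-compatible API under the `/v1` path on its default port 11434, so an OpenAI-style chat completion request can be sent to it directly. The model name "llama2" and the use of the standard library instead of the `openai` package are assumptions for illustration; substitute whatever model you have pulled.

```python
import json
import urllib.request

# Ollama's OpenAI-compatible endpoint (default local install).
OLLAMA_BASE = "http://localhost:11434/v1"

def build_chat_request(model, messages, base=OLLAMA_BASE):
    """Build the URL and JSON body for an OpenAI-style chat completion call."""
    url = f"{base}/chat/completions"
    body = {"model": model, "messages": messages}
    return url, body

url, body = build_chat_request(
    "llama2",  # assumed model name -- use one you have pulled with `ollama pull`
    [{"role": "user", "content": "Hello!"}],
)

# Send the request; skipped gracefully if no Ollama server is running.
try:
    req = urllib.request.Request(
        url,
        data=json.dumps(body).encode(),
        headers={
            "Content-Type": "application/json",
            # Ollama ignores the API key, but OpenAI clients send one anyway.
            "Authorization": "Bearer ollama",
        },
    )
    with urllib.request.urlopen(req, timeout=2) as resp:
        reply = json.load(resp)
        print(reply["choices"][0]["message"]["content"])
except OSError:
    print("Ollama server not reachable; the request above is still well-formed.")
```

Because the request shape matches OpenAI's, the same client code can be pointed at either OpenAI or a local Ollama model just by changing the base URL, which is what makes side-by-side comparisons easy.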

Comments

This was good, very concise. Unfortunately when I try using the openai api in python to access an ollama server running on a _different_ machine (not just 'localhost') I get "Connection refused" despite 2hrs of trying to vary every possible factor I could think of. Really frustrating but I'll keep hammering away at it tomorrow.
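A likely cause of the "Connection refused" error above: by default the Ollama server binds only to 127.0.0.1, so it rejects connections from other machines. Binding it to all interfaces via the `OLLAMA_HOST` environment variable is the usual fix (shown as a sketch; restarting the server with `ollama serve` requires Ollama to be installed, so it is left as a comment here):

```shell
# Ollama listens only on localhost by default, so remote clients
# get "Connection refused". Bind to all interfaces instead:
export OLLAMA_HOST=0.0.0.0
# then restart the server:
#   ollama serve
echo "OLLAMA_HOST=$OLLAMA_HOST"
```

Check as well that no firewall is blocking port 11434 on the server machine.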

perspectivex

Sir, how do we work with this in VS Code? Please reply, it is required for my internship.

shobhitagnihotri

Very nice channel, your videos and tutorials are really great! Thanks for sharing.
Let's comment and subscribe, Mark deserves to have 100M subscribers!

bithigh