Finally, Ollama has an OpenAI-compatible API

A user-contributed PR brings an OpenAI-compatible API to Ollama.
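With the compatibility layer, any OpenAI-style client can talk to a local Ollama server. A minimal sketch using only the standard library, assuming the default server address (`localhost:11434`) and a model named `llama2` already pulled — adjust both to your setup:

```python
import json
import urllib.request

# Assumed default address of the local Ollama server's
# OpenAI-compatible chat completions endpoint.
BASE_URL = "http://localhost:11434/v1/chat/completions"

def build_payload(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def chat(model: str, prompt: str) -> str:
    """POST the request to the local server and return the reply text."""
    req = urllib.request.Request(
        BASE_URL,
        data=json.dumps(build_payload(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    # Responses follow the OpenAI chat completion shape.
    return data["choices"][0]["message"]["content"]

# Build (but don't send) a request, so this runs without a live server.
payload = build_payload("llama2", "Why is the sky blue?")
```

Because the request and response shapes match OpenAI's, tools like AutoGen can be pointed at this endpoint instead of the hosted API.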

Comments

Yes, please. Autogen + Ollama with agents each using a different model. Please mention deployment/resource considerations. Love your videos.

wadejohnson

This video is just 10 minutes, but it covers hours of content if you want to reproduce the same results from scratch and are new to these tools. I like this approach: it not only helps you get what you need done but also shows you some relevant tools that might help.

fslurrehman

Thank you!
Would love to see more LangGraph and CrewAI content... but I think you just explained perfectly what I need to do.

johngoad

Excellent topic and great delivery. Keep up the good work!

shonnspencer

Your videos are so great. Informative and clear. Definitely my favorite on AI topics. Thanks!!

jayk

Flowise plus Ollama with multilingual embeddings. Thanks for your videos!

ЭдуардРа

Ha, I love your humor! I came for the great AI info, but stayed for your personality. :)

tristanbob

Using Ollama and LiteLLM worked quite well for me for generating the OpenAI API endpoint for AutoGen.

Unicron

Came for Ollama, stayed for Bluey. Nice Video 😊

laikastoq

Love your persona!! Phenomenal content!!

aimademerich

Is function calling supported in the OpenAI compatibility layer?

magick

came for the knowledge, stayed for the humor

myronkoch

What API is Matt providing in AutoGen's new model creation (5:45 timestamp)? And how do I get that API?

things

Great video - pew pew ... pew pew pew...

jonm

Awesome video! Very informative!
Can you tell me how you added an API key to the Ollama server? Would be helpful.
Thanks

debarghyamaity

Very informative with lots of subtle humor. You earned a sub from me! Thanks for sharing. I have been waiting for this update.

MadsVoigtHingelberg

Keep it up. Your videos are informative and entertaining. Would like to see the best way to use Ollama on a Windows machine.

jrfcs

Great video! Noob question: for AutoGen and Ollama, do I need to run the `ollama run` command first, or does it serve automatically? I can see Ollama is running in a web browser; I'm just not sure if I need to run the model in a terminal first. Thanks

Royaltea_Citizen

Wow, wonderful, nice and crisp. Yes, please: AutoGen + Ollama with multiple agents and deployment -> waiting for it!!! Thanks in advance.

roopeshk.r

Thank you. Great video. More Autogen and Ollama. Please.

AA-pwbk