LiteLLM Tutorial to Call Any LLM with API Locally

This video shows how to install LiteLLM locally on Windows or Linux and use it to call the APIs of Anthropic, Hugging Face, Cohere, Together AI, Azure, OpenAI, AWS, GCP, etc.
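As a hedged sketch of the pattern the video demonstrates (not a transcript of it): LiteLLM exposes a single `completion()` function, and the provider is selected by the model string. The model names and environment variables below are illustrative assumptions; check your provider's current model list.

```python
# Hedged sketch (not verbatim from the video): LiteLLM's unified completion()
# call, where the model string picks the provider. Assumes `pip install litellm`
# and a provider key such as OPENAI_API_KEY in the environment; the model names
# below are illustrative and may need updating.
import os

messages = [{"role": "user", "content": "Say hello in one short sentence."}]

def ask(model: str):
    # Deferred import so the sketch degrades gracefully if litellm is absent.
    from litellm import completion
    return completion(model=model, messages=messages)

if __name__ == "__main__" and os.environ.get("OPENAI_API_KEY"):
    # Same call shape works for e.g. "claude-3-haiku-20240307" with ANTHROPIC_API_KEY set.
    print(ask("gpt-3.5-turbo").choices[0].message.content)
```

The point of the design is that swapping providers only changes the model string, not the calling code.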

#litellm


All rights reserved © 2021 Fahd Mirza
Comments

I went from not knowing your channel existed to frequently referencing your content, and this video is a key example. With the rise of open-source LLM models I started using Ollama (yep, you'll find me commenting on your video there), and a few days/months later I needed to consume both paid models and my own through a single front end. And there you pop up. Thanks for this; it helps make sense of a very, very fast-changing landscape.

DeonBands

For hosting a model on a local machine and exposing APIs to access that model, what approach would you suggest? LiteLLM, LangChain, or both? Do you have a tutorial on that?

arunavabhattacharya
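One common answer to the question above (a sketch under assumptions, not from the video): LiteLLM ships a proxy server that wraps a local model, for example one served by Ollama, behind OpenAI-compatible HTTP endpoints. The model name and port below are illustrative, and this assumes `pip install 'litellm[proxy]'` plus a running Ollama daemon.

```shell
# Hedged sketch: expose a local Ollama model through LiteLLM's
# OpenAI-compatible proxy. Model name and port are illustrative.
litellm --model ollama/llama2 --port 4000

# The proxy now serves OpenAI-style endpoints, e.g.:
curl http://localhost:4000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "ollama/llama2", "messages": [{"role": "user", "content": "hi"}]}'
```

With this setup, anything that speaks the OpenAI API (including LangChain's OpenAI integrations) can talk to the local model, so the two tools complement rather than replace each other.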

Can I use LiteLLM together with an OpenAI client to "mimic" an OpenAI model, while it's actually using Google Gemini?

MrMoonsilver
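Regarding the question above, one common pattern (a sketch under assumptions, not confirmed by the video): run the LiteLLM proxy in front of Gemini, then point the standard OpenAI client at it via `base_url`, so the client code stays unchanged. The port, model name, and key handling below are assumptions.

```python
# Sketch: an unmodified OpenAI client talking to Google Gemini through a
# LiteLLM proxy. Assumes `pip install openai` and a proxy started with e.g.:
#   litellm --model gemini/gemini-pro --port 4000   (GEMINI_API_KEY in the env)
import os

PROXY_URL = "http://localhost:4000"  # assumed LiteLLM proxy address

def ask_via_proxy(prompt: str) -> str:
    # Deferred import so the sketch degrades gracefully if openai is absent.
    from openai import OpenAI
    # The proxy holds the real Gemini key, so the client key can be a dummy.
    client = OpenAI(base_url=PROXY_URL, api_key="anything")
    resp = client.chat.completions.create(
        model="gemini/gemini-pro",
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

if __name__ == "__main__" and os.environ.get("LITELLM_PROXY_RUNNING"):
    print(ask_via_proxy("Hello from the OpenAI client!"))
```

The client never knows it is talking to Gemini; the proxy translates the OpenAI-style request and response on its behalf.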