3 ways to interact with Ollama | Ollama with LangChain

In this video, I showcase three ways to interact with Ollama models running locally: the command line, the REST API, and LangChain. You'll see how to run an LLM locally with Ollama and how to use it with LangChain. Minimal sketches of each approach follow the timestamps below.

Timestamps:
0:00 intro
0:35 what is Ollama?
1:50 via command line
2:52 via the API (Postman)
4:30 using with LangChain
11:15 outro
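
Command line (1:50): once Ollama is installed, everything happens directly in the terminal. A minimal sketch, assuming llama2 as an example model (the video may use a different one):

ollama pull llama2   # download the model weights locally
ollama run llama2    # start an interactive chat session in the terminal
ollama list          # show which models are installed on this machine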
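
API (2:52): the video uses Postman, but any HTTP client can hit the local REST endpoint Ollama serves on port 11434 by default. A minimal Python sketch (the model name is an example; assumes the requests package is installed):

import requests

# Ollama serves a REST API on localhost:11434 by default.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama2",             # example model; use whichever you pulled
        "prompt": "Why is the sky blue?",
        "stream": False,               # one JSON object instead of a token stream
    },
)
print(resp.json()["response"])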
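
LangChain (4:30): a minimal sketch using LangChain's community Ollama wrapper; the exact import path varies across LangChain versions, and the model name is again an example:

from langchain_community.llms import Ollama
from langchain_core.prompts import PromptTemplate

# Points at the same local Ollama server as the API example above.
llm = Ollama(model="llama2")

# Call the model directly...
print(llm.invoke("Why is the sky blue?"))

# ...or compose it into a simple prompt-template chain.
prompt = PromptTemplate.from_template("Explain {topic} in one short paragraph.")
chain = prompt | llm
print(chain.invoke({"topic": "local LLM inference"}))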

Resources:

Support this channel:

Connect with me:
Comments:

SubinKrishnaKT: Yes we need a tutorial of using ollama through docker!