Generative AI 03: Offline LLMs using #Ollama command line and Python

In this video:
3- Command line tool: list, pull, run, serve (see the command sketch below)
4- Ollama APIs and changing the default port on Windows (REST example below)
5- Using the Python package in VS Code (Python sketches below)
- List, pull, run, ...
- Streaming response
- Custom client
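
A minimal sketch of the four CLI subcommands named above. The model name llama3.2 is only an example; any model available in the Ollama library works the same way.

# Show the models already downloaded to this machine
ollama list
# Download a model from the Ollama library (example model name)
ollama pull llama3.2
# Chat with a model interactively in the terminal
ollama run llama3.2
# Start the Ollama HTTP server manually (listens on port 11434 by default)
ollama serve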
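The CLI talks to a local REST API that listens on http://localhost:11434 by default. On Windows the bind address can be changed by setting the OLLAMA_HOST environment variable (for example setx OLLAMA_HOST "127.0.0.1:11500") before running ollama serve. Below is a standard-library-only Python sketch of calling the /api/generate endpoint; the model name, prompt, and port are assumed placeholders.

import json
import urllib.request

# Default Ollama endpoint; change the port here if OLLAMA_HOST was overridden
URL = "http://localhost:11434/api/generate"

payload = {
    "model": "llama3.2",   # example model name, use any model you have pulled
    "prompt": "Why is the sky blue?",
    "stream": False,       # ask for one JSON response instead of a stream
}

req = urllib.request.Request(
    URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.loads(resp.read().decode("utf-8"))

print(body["response"])    # the generated text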
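The same list / pull / run steps from a script using the ollama Python package (pip install ollama). This sketch assumes the Ollama server is already running locally and again uses llama3.2 as a stand-in model name.

import ollama

# Show which models are already downloaded (prints the raw listing)
print(ollama.list())

# Pull a model if it is not available locally yet (example name)
ollama.pull("llama3.2")

# Single chat request: the script equivalent of `ollama run llama3.2`
response = ollama.chat(
    model="llama3.2",
    messages=[{"role": "user", "content": "Give me a one-line fun fact."}],
)
print(response["message"]["content"])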
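Streaming response: with stream=True the chat call returns an iterator of partial responses, so tokens can be printed as they are generated instead of waiting for the full answer. Same assumptions as the sketch above.

import ollama

stream = ollama.chat(
    model="llama3.2",   # example model name
    messages=[{"role": "user", "content": "Explain what an LLM is in two sentences."}],
    stream=True,
)

for chunk in stream:
    # Each chunk carries the next piece of the generated text
    print(chunk["message"]["content"], end="", flush=True)
print()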
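Custom client: useful when the server is not on the default host/port, for example after changing OLLAMA_HOST as described above. The host value 127.0.0.1:11500 is an assumed example.

from ollama import Client

# Point the client at a non-default host/port
client = Client(host="http://127.0.0.1:11500")

response = client.chat(
    model="llama3.2",   # example model name
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response["message"]["content"])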