Generative AI 03: Offline LLMs using #Ollama commandline and python

In this video:
3- Command-line tool: list, pull, run, serve
4- Ollama APIs, changing the default port on Windows
5- Using the Python package in VS Code
- List, pull, run, ...
- Streaming response
- Custom client
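The command-line and API topics above can be sketched as follows. This is a minimal reference, assuming Ollama is installed and using `llama3` purely as an example model name; it requires a running Ollama server, so treat it as a usage sketch rather than a self-contained script.

```shell
# List locally installed models
ollama list

# Download a model from the Ollama library
ollama pull llama3

# Run a model interactively in the terminal
ollama run llama3

# Start the Ollama server (serves a REST API, default port 11434)
ollama serve

# On Windows, the default port can be changed by setting the
# OLLAMA_HOST environment variable before starting the server:
#   set OLLAMA_HOST=127.0.0.1:11500
#   ollama serve

# Call the REST API directly (default endpoint shown)
curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```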
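The Python-package items (list, pull, run, streaming response, custom client) can be sketched with the `ollama` package. This assumes `pip install ollama` and a running Ollama server; `llama3` and the host/port in the custom client are example values, so the snippet is a usage sketch rather than something runnable in isolation.

```python
import ollama
from ollama import Client

# List local models (mirrors `ollama list` on the command line)
for model in ollama.list()["models"]:
    print(model)

# Pull a model (mirrors `ollama pull`)
ollama.pull("llama3")

# One-shot chat request (mirrors `ollama run`)
response = ollama.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
)
print(response["message"]["content"])

# Streaming response: stream=True returns an iterator of chunks,
# so tokens can be printed as they arrive
for chunk in ollama.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Tell me a joke."}],
    stream=True,
):
    print(chunk["message"]["content"], end="", flush=True)

# Custom client, e.g. for a server on a non-default host/port
client = Client(host="http://localhost:11500")
reply = client.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Hello"}],
)
print(reply["message"]["content"])
```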
