Run LLMs locally using OLLAMA | Private Local LLM | OLLAMA Tutorial | Karndeep Singh

The video explains how to run LLMs locally using OLLAMA, fast and easy. The following topics are covered in the video:
1. OLLAMA installation on Mac.
2. Downloading and using LLM models in OLLAMA (example commands are shown after this list).
3. Customizing the OLLAMA Modelfile to set model parameters and system prompts (a sample Modelfile follows below).
4. Understanding the different CLI commands in OLLAMA.
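
As a quick reference for topics 2 and 4, the commands below are a minimal sketch of a typical Ollama workflow on a Mac. The model tag llama2 is only an illustrative example; substitute any model available in the Ollama library.

# Install Ollama on macOS (Homebrew is one option; the installer from ollama.com also works)
brew install ollama

# Download a model and start an interactive chat session
ollama pull llama2
ollama run llama2

# Other everyday CLI commands covered in the video
ollama list          # show models downloaded locally
ollama show llama2   # inspect a model's details
ollama rm llama2     # delete a local model
ollama serve         # start the Ollama server manually if it is not already running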
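
For topic 3, here is a minimal sample Modelfile. The base model (llama2), the parameter values, the system prompt, and the model name my-assistant are all illustrative assumptions, not recommendations from the video.

# Modelfile
FROM llama2
PARAMETER temperature 0.7
PARAMETER num_ctx 4096
SYSTEM """You are a concise assistant that answers questions about machine learning."""

# Build and run the customized model
ollama create my-assistant -f Modelfile
ollama run my-assistant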

Connect with me on:

Creative Commons CC BY-SA 3.0

#ollama #mac #llms
Comments

Namaste sir,
I am a blind person. I want to learn data labelling from scratch to expert level. Please guide me on how to get started practically.

arindammazumdar