How to Run Any GGUF AI Model with Ollama Locally

This video is a step-by-step tutorial on installing and running any LLM in GGUF format with Ollama locally.
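The core of the workflow is pointing Ollama at a local GGUF file via a Modelfile. A minimal sketch, assuming a downloaded GGUF file named `mymodel.gguf` in the current directory (the filename is hypothetical):

```shell
# Write a minimal Modelfile that points Ollama at the local GGUF file.
cat > Modelfile <<'EOF'
FROM ./mymodel.gguf
EOF

# Register the model under a name of your choice, then run it
# interactively (these two commands require Ollama to be installed):
#   ollama create mymodel -f Modelfile
#   ollama run mymodel
```

The Modelfile can also set parameters and a prompt template, but a bare `FROM` line is enough to load a GGUF file.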

🔥 Get a 50% discount on any A6000 or A5000 GPU rental — use the following link and coupon:

Coupon code: FahdMirza

#gguf #ollama

PLEASE FOLLOW ME:

RELATED VIDEOS:

All rights reserved © 2021 Fahd Mirza
Comments

Thanks for your tutorial. I follow only you, and I learn a lot from you.

TheYuriTS

How can I fix "Error: this model is not supported by your version of Ollama. You may need to upgrade" when I have already upgraded to the latest version?

ZVCi

I didn't even know a 48GB VRAM Nvidia GPU existed for personal use.

chandraprakash