Ollama added Windows support to run local LLMs easily - No GPU needed

Any Windows PC can now run local open-source large language models using Ollama. #ollama #llm #localai
No additional dependencies.
Just download the Ollama program.
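
As a quick smoke test after installing, here is a minimal sketch that talks to Ollama's local HTTP API. It assumes the server is running on its default port 11434 and that you have already pulled a model; "llama2" is just one example model name, substitute whatever you downloaded:

    import json
    import urllib.request

    # Send one prompt to the local Ollama server (default port 11434).
    # "llama2" is an example model name; use any model you have pulled.
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps({
            "model": "llama2",
            "prompt": "Why is the sky blue?",
            "stream": False,  # return one JSON object instead of a stream
        }).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["response"])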


Thank you for watching!
Comments

To better test this kind of model, I usually ask it to write a 2,000 or 4,000 word article about something.
That shows more clearly what it actually uses (CPU or GPU), how much, and for how long (a sketch of such a test follows below).

DS-pkeh
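
A rough sketch of that test, again assuming the default local Ollama API and an example model name; run it and watch Task Manager's CPU and GPU graphs while it generates:

    import json
    import time
    import urllib.request

    # Time a deliberately long generation; watch Task Manager (CPU/GPU tabs)
    # while it runs to see which device Ollama actually uses and for how long.
    payload = json.dumps({
        "model": "llama2",  # example name; use whichever model you pulled
        "prompt": "Write a 2000 word article about the history of computing.",
        "stream": False,
    }).encode()

    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    start = time.time()
    with urllib.request.urlopen(req) as resp:
        text = json.loads(resp.read())["response"]
    elapsed = time.time() - start

    words = len(text.split())
    print(f"{words} words in {elapsed:.1f}s ({words / elapsed:.1f} words/s)")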

I have an issue: the first time I set up Llama it ran fine, but later when I opened cmd and ran Llama again it was terribly slow.

riyan

Please upload a full tutorial for running Stable Diffusion on a Ryzen 5 5600G on Windows

anaskhan-lzhk

How do you get Ollama to use the dedicated GPU on a laptop? I have an AMD one.

adityamaheshwari