Ollama | Easiest way to run a local LLM on Mac and Linux

Get up and running with large language models locally, using Ollama.

Notes:
The easiest way to run local LLMs? We'll find out.
Currently only macOS and Linux are supported; Windows support is coming soon.

Commands:
ollama run llama2
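
A few other commands may be useful once Ollama is installed. This is a minimal sketch of the standard CLI, using the llama2 model from the video; the Linux install script URL is the one current at the time and may have moved since (on a Mac, Ollama is installed from the downloadable app):

curl https://ollama.ai/install.sh | sh    # install Ollama on Linux
ollama pull llama2                        # download the model without starting a chat
ollama list                               # show models available locally
ollama run llama2                         # pull the model if needed, then open an interactive prompt

Note that ollama run alone is enough to get started; it downloads the model on first use.
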
Comments:

Bruh.... Make the audio stereo please.

alvaro

Is there any specific config you need to do in order for Ollama to use the GPU on your Mac?

erictong

Great video. Does this also have API features, so I can use it for Home Assistant's OpenAI capabilities (Assist) with this easy setup?

johnnijakobsen
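
On the API question above: Ollama also exposes a local HTTP API (listening on localhost port 11434 by default), so other tools can call it programmatically. A minimal sketch of a generate request, assuming the llama2 model has already been pulled:

curl http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "Why is the sky blue?"
}'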

Can you please build this with a chat UI? Something like the ChatGPT interface, which remembers conversation history and displays the newest messages at the bottom of the chat?

SanataniAryavrat