Setting environment variables for Ollama on Windows

Ollama is configured through environment variables. Here is how to set them on Windows.

Comments

Another option in cases like this is to use the Windows mklink command; that way you can have your files scattered anywhere you like and just make links to files (or junctions to directories) to pull everything together into the default folder. I haven't tried this with Ollama, but I can't imagine it would cause any problems unless there are corner cases I don't know about.

localminimum
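
A minimal sketch of the junction approach described above, assuming the model files have already been moved to a hypothetical D:\llm\models folder. New-Item -ItemType Junction is the PowerShell equivalent of mklink /J; the default folder must not already exist when the link is created:

    # Junction from Ollama's default models folder to where the files really live.
    # Remove or rename %USERPROFILE%\.ollama\models first if it already exists.
    New-Item -ItemType Junction -Path "$env:USERPROFILE\.ollama\models" -Target "D:\llm\models"

A junction works without setting any environment variable, since Ollama keeps reading its default path; the trade-off is that the redirection lives on the filesystem instead of in config.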

It's Friday, it's Matt, it's bloody perfect

tiredofeverythingnew

Can you make another video but for Linux? Because on my Linux machine, when I change the models directory in "ollama.service" with the command "sudo systemctl edit ollama.service", I need to start "ollama serve" in the terminal every time I want to use the models

AhmedAyman-kicm
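
A hedged sketch for the Linux question above, with an illustrative path: if the override created by "sudo systemctl edit ollama.service" carries the variable in its [Service] section and the service is enabled, systemd should start Ollama on boot with the new models directory, with no manual "ollama serve" needed:

    # The override opened by "sudo systemctl edit ollama.service" should contain:
    #   [Service]
    #   Environment="OLLAMA_MODELS=/mnt/data/ollama/models"
    sudo systemctl daemon-reload
    sudo systemctl enable --now ollama   # start now and on every boot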

There is something to be said for simplicity. A folder with GGUF files would be more compatible and easier to maintain. As of now, we have to keep an entire duplicate set of GGUFs for LM Studio, etc.

techguy

I don't want to download models twice, since they consume a lot of space. How can I share them with LM Studio's GGUF models using environment variables and symbolic links in Windows?

mayorc
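
One hedged sketch for sharing in that direction: Ollama stores weights as content-addressed blobs (sha256-... files) that are plain GGUF underneath, so a pulled model can be exposed to LM Studio through a symbolic link under a .gguf name. The LM Studio path, model name, and hash are all hypothetical, and creating file symlinks needs an elevated prompt or Developer Mode:

    # Print the Modelfile for a pulled model; its FROM line shows the blob path.
    ollama show llama3 --modelfile
    # Link that blob into LM Studio's models tree under a .gguf name.
    New-Item -ItemType SymbolicLink `
        -Path "D:\lmstudio\models\meta\llama3\llama3.gguf" `
        -Target "$env:USERPROFILE\.ollama\models\blobs\sha256-<hash>"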

Unable to connect dify to Ollama for Windows!
What value should I put in OLLAMA_HOST when setting environment variables on Windows?

mohammadkondori
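
A hedged guess at the two halves of that setup: OLLAMA_HOST controls the address Ollama binds to, and the default of 127.0.0.1 is unreachable from inside a container. Setting it to 0.0.0.0 makes Ollama listen on all interfaces; if dify runs under Docker Desktop, the base URL it needs would then typically be http://host.docker.internal:11434:

    # Persist for new processes, then restart Ollama so it rebinds.
    setx OLLAMA_HOST "0.0.0.0:11434"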

Thanks Matt. I've been trying to get my native Windows Ollama install to allow connections from other machines on my network; I knew I had to pass environment variables but had no idea how to do it on Windows. Hopefully this technique will work.

JoelGriffinDodd
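
For connections from other machines, the same OLLAMA_HOST=0.0.0.0 setting shown above applies, and Windows Firewall must also allow inbound traffic on the port. A sketch, run from an elevated prompt; the rule name is arbitrary:

    netsh advfirewall firewall add rule name="Ollama 11434" dir=in action=allow protocol=TCP localport=11434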

Just yesterday I was dealing with that problem, thank you

codeoconloscodos

What was wrong with keeping the settings in the .conf file?

vulcand

Moving from WSL2 to Windows: copying and pasting the blobs and manifests folders doesn't work; native Ollama doesn't see the files copied from WSL2. How does one go about migrating a large set of models without re-downloading? I tried changing colons to dashes and that didn't work either.

techguy
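
A migration sketch that has worked for some setups, offered only as a starting point since the commenter reports renaming alone did not fix it: older Linux builds name blobs sha256:<hash>, colons are illegal on NTFS, so the rename has to happen inside WSL2 before copying, and both the blobs and manifests trees must come across before fully restarting Ollama. The distro name and source path assume a default service install:

    # Inside WSL2, rename blobs so the filenames are legal on NTFS.
    wsl -e bash -c 'cd /usr/share/ollama/.ollama/models/blobs && for f in sha256:*; do mv "$f" "${f/:/-}"; done'
    # Copy the whole models tree to the native Windows location.
    robocopy '\\wsl$\Ubuntu\usr\share\ollama\.ollama\models' "$env:USERPROFILE\.ollama\models" /E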

For some reason, I can't get autocompletion to work with any Ollama extension in VS Code. The only one that sort of worked was Continue, but it was awful and doesn't even produce proper tab indentation for Python code.

RebelliousX

Thanks, but what about previously downloaded models?
Can I move them manually, or what?

AliAlias
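
A sketch of the manual move, with a hypothetical target folder: relocate the existing store, point OLLAMA_MODELS at it, and restart Ollama so the variable takes effect:

    # Move the default store to another drive (robocopy /MOVE deletes the source).
    robocopy "$env:USERPROFILE\.ollama\models" "D:\llm\models" /E /MOVE
    # Persist the variable for new processes.
    setx OLLAMA_MODELS "D:\llm\models"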

Interesting. Any idea how to permanently reduce Ollama's task priority under Windows? It stalls the system and times out RDP until it finishes processing. I've set the priority to low, but it frequently changes itself back from "below normal" to "above normal" and the system goes to crap again. :( Any ideas?

VKFVAX
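
Not a permanent fix, but a sketch for re-applying the priority whenever it drifts back; PriorityClass is the same knob Task Manager exposes, and it resets whenever the process restarts:

    # Drop every running ollama process to below-normal priority.
    Get-Process ollama* | ForEach-Object { $_.PriorityClass = 'BelowNormal' }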

Thank you, I was testing Ollama and it's really great, easy to install, and fast enough even on my potato machine 😀

geomorillo

Can you do a video on setting up Ollama on Windows with the Ollama web UI?

TheGladScientist

Thanks Matt, I started using this native Windows version a couple of days ago. I was using Linux (WSL, in fact) but wanted to avoid the tunneling settings, etc. Anyway, my issue is that everything runs very slowly, as in taking an hour to answer a question, while on the same laptop WSL was much faster, more like 3-5 minutes (not great, but what I expected). All running on CPU. The main difference is that with WSL the machine was using 5 GB of the available RAM, while native Windows only uses 300 MB. Would you know of a way to give it more memory?

joser

Matt, I have an AMD integrated Radeon graphics card. When I run Ollama on Windows it defaults to using my CPU and it's quite slow. How can I make it use the GPU instead?

a-blaze
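
Heavily hedged, since most integrated Radeon parts are not on Ollama's supported ROCm list and CPU fallback is then expected: where ROCm is present, the usual experiment is the HSA_OVERRIDE_GFX_VERSION environment variable, which spoofs a supported gfx target. The value below is only an example and must match a target your ROCm build actually supports:

    setx HSA_OVERRIDE_GFX_VERSION "10.3.0"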

I'm having trouble with the port assignment with Ollama for Windows. It keeps closing and denying access to the port. Maybe because I have the WSL version of Ollama installed on the same machine? Need to check the Discord. Anyway, I really like your videos. Entertaining and informative. Keep it up. I'm a subscriber

jrfcs
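
A sketch for diagnosing that kind of clash: check what is already holding the default port (a WSL-side Ollama forwarded to Windows would show up here), then move the native install to a free port; the alternate port is arbitrary:

    # Show who owns 11434; the last column is the owning PID.
    netstat -ano | Select-String 11434
    # Rebind native Ollama to a different port if there is a conflict.
    setx OLLAMA_HOST "127.0.0.1:11435"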

You can use the mklink /J command to create a junction to any location without playing with variables

maybeHardDog

Used to run it in Docker on WSL and it was super slow; now on Windows it's super fast 🤘🏻

shmloney