Run Your Own Local ChatGPT: Ollama WebUI

Today we learn how we can run our own ChatGPT-like web interface using Ollama WebUI.
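For context, Ollama WebUI is a front end for the HTTP API that a local Ollama server exposes (port 11434 by default). The following is a minimal sketch of talking to that API directly, assuming Ollama is already running and the llama2 model has been pulled; the port, model name, and prompt are assumptions for illustration, not taken from the video.

# Minimal sketch: query a locally running Ollama server directly,
# the same API the WebUI talks to behind the scenes.
# Assumes Ollama listens on the default port 11434 and that the
# "llama2" model has been pulled beforehand (ollama pull llama2).
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"

def ask(prompt, model="llama2"):
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        OLLAMA_URL + "/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask("Explain in one sentence what a local LLM is."))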

Comments

Thank you for this, it's fantastic. Would you be able to demonstrate installing a local LLM to query your own documents? I have come across a number of tutorials for this but had no success running them.

Jenko
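On the question above about querying your own documents: this is not covered in the video, but a very rough sketch of the idea against Ollama's API could look like the following. The /api/embeddings and /api/generate endpoints are Ollama's documented ones; the sample documents, the question, and the reuse of llama2 for embeddings are illustrative assumptions only, not the video's method.

# Rough sketch: embed a few text snippets with Ollama, pick the one
# closest to the question, and pass it to the model as context.
# llama2 is reused for embeddings only to keep the example to one model;
# a dedicated embedding model would normally work better.
import json
import math
import urllib.request

OLLAMA = "http://localhost:11434"

def post(path, body):
    req = urllib.request.Request(
        OLLAMA + path,
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

def embed(text):
    return post("/api/embeddings", {"model": "llama2", "prompt": text})["embedding"]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

docs = [
    "Our office Wi-Fi password is rotated every 90 days.",
    "Invoices must be submitted before the 5th of each month.",
]
question = "How often is the Wi-Fi password changed?"

q_vec = embed(question)
best = max(docs, key=lambda d: cosine(embed(d), q_vec))
answer = post("/api/generate", {
    "model": "llama2",
    "prompt": "Context: " + best + "\n\nQuestion: " + question + "\nAnswer briefly.",
    "stream": False,
})["response"]
print(answer)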

Love it! Do you mind sharing the hardware specs of the desktop/laptop running llama2? The speed looks great in your demo. Thanks!

cesoirg

You DON'T have to use Docker, and there are some good reasons not to, but all the tutorials use Docker.

chrisshelswell

When I open the WebUI in the browser and log in, there are no models, even though I have Ollama installed with llama2. How do I get my models to show up in the web UI? Thanks

joekustek
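Several comments here ask why no models show up in the WebUI. The model list in the UI mirrors what the local Ollama server reports, so if nothing has been pulled yet the list is empty. A small check, assuming the default port:

# Sketch: list the models the local Ollama server knows about.
# If this prints an empty list, pull one first (e.g. "ollama pull llama2"
# in a terminal), and the WebUI dropdown should then show it as well.
import json
import urllib.request

with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
    tags = json.loads(resp.read())

names = [m["name"] for m in tags.get("models", [])]
print("Locally installed models:", names if names else "none")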

Please make a video on how we can fine-tune an open-source model.

devagarwal

Hello, can I make this control my network and ask it questions about it to get the information locally?

Al_Miqdad_

How did you download the model, and where did you place it?
For me it's blank since I didn't download any model.

anshulsingh

Awesome content as always, Mr. Maximilian.

alexanderv

Hi! Thanks for showing me Ollama, it is really useful.

skkeei

That's a bold rule to lay on me, and everyone here, for that matter 😄 Just go to a comment that you resonate with, will you?

Noobinski

Hi, can you share which video card you are using for this demo?

louiscklaw

How did you get the models to show when you opened it? Mine is completely empty. You did not show how you did that.

joekustek

Mine looks different: it installed, but the icon is OI, not Llama. I can't load Llama LLMs. Hmm...

Electroxd

Thanks for sharing 👍 Are the installation steps the same for setting up on cloud instances? 👨🏽‍💻

Dz-Hub-lltr

Hey, so I'm trying to write an Alexa task that will provide a conversational UI with an offline LLM (my use case is crisis relief workers in areas with limited or downed connectivity).

Would the VoiceGPT extension work with Ollama WebUI? Also, is there a risk rating for wrong results from the lighter 2B or smaller models?

XiangYu

Hey there, I'm asking if I can remove the register button, because mine is a private AI and I don't want other people using my PC as an AI. Can you help me?

ghostandry

Where is the link to the Docker website?! How am I supposed to do anything if the link isn't even there?!

MrStellateWaffle

Hi, can we deploy this model with the UI on any platform like GitHub or something else?

amanreddypundru

Can you give it a dataset/knowledge source?

HasimFN

Can you do a video on how to install it on one computer and access it via Wi-Fi?

yasiruperera
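On the last question about reaching the setup from other machines on the same Wi-Fi: by default Ollama only listens on localhost, but its documented OLLAMA_HOST setting (for example OLLAMA_HOST=0.0.0.0) makes it listen on the LAN as well, and a WebUI container whose port is published with Docker's -p flag is already reachable via the host's IP. A sketch of querying Ollama from a second machine, using a placeholder address:

# Sketch: query Ollama running on another machine on the same network.
# Requires the Ollama host to listen on its LAN interface (e.g. by setting
# OLLAMA_HOST=0.0.0.0 before starting it). The IP below is a placeholder.
import json
import urllib.request

HOST_IP = "192.168.1.50"  # hypothetical LAN address of the Ollama machine

payload = json.dumps({
    "model": "llama2",
    "prompt": "Say hello from across the network.",
    "stream": False,
}).encode()
req = urllib.request.Request(
    "http://" + HOST_IP + ":11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])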