How To Run ANY Open Source LLM LOCALLY In Linux

In this video, I will show you how to run any open-source LLM (large language model) locally on Linux using Ollama & LM Studio. Ollama & LM Studio are the best tools for running models such as llama3, Gemma, Mistral, codellama & many more. Watch this video and learn how to run LLMs locally on a Linux computer.
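
For reference, the core Ollama flow from the video condenses to two commands. These match Ollama's official Linux instructions; 'llama3' is just the example model used in the video:

    # Install Ollama on Linux with the official one-line installer
    curl -fsSL https://ollama.com/install.sh | sh

    # Download (on first run) and chat with a model
    ollama run llama3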

*Timestamps*
00:00 Introduction
00:38 Pre-requisites 
01:16 Installing Ollama
02:18 Download LLM
03:01 Testing LLAMA3 & Gemma
05:31 Customizing Model
06:55 Installing LMStudio
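
For the LM Studio step (06:55): on Linux, LM Studio ships as an AppImage downloaded from lmstudio.ai, so installing it is a sketch like the following; the filename is illustrative and depends on the version you download:

    # Make the downloaded AppImage executable, then launch it
    chmod +x LM-Studio-*.AppImage
    ./LM-Studio-*.AppImage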

*Download*

*Relevant Tech Videos*

~ *Connect On Instagram* - @KSKROYALTECH

*© KSK ROYAL* 
    *MereSai*
*Comments*

For anyone having trouble with 'ollama create': you have to spell the model's name in lowercase, according to the Ollama documentation. So the first line would be 'FROM llama3'.

xdaddev
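
For context on the tip above: the customization step (05:31) builds on a Modelfile. A minimal sketch, using the lowercase 'llama3' base name the comment points out; the parameter and system prompt are just examples:

    # Modelfile
    FROM llama3
    PARAMETER temperature 0.7
    SYSTEM "You are a concise Linux assistant."

    # Build the custom model from the Modelfile, then run it
    ollama create mylinuxbot -f Modelfile
    ollama run mylinuxbot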

The Ollama system configuration is very useful for agentic workflows.
Need to learn to make LLMs talk to each other.

shrirammadurantakam
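
On making local LLMs talk to each other: Ollama exposes an HTTP API on localhost:11434, so one naive sketch is to bounce each model's reply into the other's prompt. The model names and opening message are placeholders; this assumes curl and jq are installed:

    #!/usr/bin/env bash
    # Naive two-model conversation over Ollama's /api/generate endpoint
    ask() {  # ask MODEL PROMPT -> prints the model's reply
      jq -n --arg m "$1" --arg p "$2" '{model:$m, prompt:$p, stream:false}' \
        | curl -s http://localhost:11434/api/generate -d @- \
        | jq -r .response
    }

    msg="Tabs or spaces? Take a side."
    for round in 1 2 3; do
      msg=$(ask llama3 "$msg"); echo "llama3: $msg"
      msg=$(ask gemma "$msg");  echo "gemma:  $msg"
    done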

Alpaca is the best GUI for LLMs. It's on Flatpak as well. Clean & simple UI.

wolfisraging
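
For anyone wanting to try the Alpaca frontend mentioned above, it is on Flathub; the application ID below is the one listed there at the time of writing:

    flatpak install flathub com.jeffser.Alpaca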

Very nice video 🙏🏼
But is there also a free AI you can host locally for picture generation? Maybe that would be worth a video 😊🙌🏼 I would be interested 💯🙌🏼

ZerYT

Running LLMs locally is cool, but what is the best training set?

seventhtenth

Which is your main system for your work? Also, what are you doing in your life, like from an education POV?

Amit-hbex

Does this work if there is no internet?

jivtheshm.r
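
On the offline question: Ollama only needs the internet to download model weights; inference runs entirely on your machine. A sketch:

    # While online: fetch the model weights once
    ollama pull llama3

    # Any time after, even with no internet: runs against the local copy
    ollama run llama3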

Is it more powerful than ChatGPT-4o, or not as capable?

abhidnyasonawane

Can you guide us through setting up ComfyUI as a web UI, ComfyUI+Krita, and ComfyUI+Blender? I was able to set up and use ComfyUI as a web UI and ComfyUI+Krita, but when I try to set up ComfyUI in Blender I get an error message.
OS - Garuda
System - Fully AMD

rayrai
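
On the ComfyUI web-UI part of the question above (the Blender add-on error is beyond what the video covers): the usual all-AMD setup follows ComfyUI's README, using the ROCm build of PyTorch. The ROCm index URL changes across PyTorch releases, so treat the one below as a placeholder and check the README:

    git clone https://github.com/comfyanonymous/ComfyUI
    cd ComfyUI
    # AMD GPUs need the ROCm wheels of PyTorch (exact index URL per the README)
    pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/rocm6.0
    pip install -r requirements.txt
    # Web UI comes up on http://127.0.0.1:8188
    python main.py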

Is there a way to host AI locally but access it remotely from a phone?
Preferably something other than TeamViewer; it's pretty clunky.

Yowise-qk
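
On remote access: the Ollama server can listen on all interfaces instead of just localhost via the OLLAMA_HOST variable, after which any device on your LAN, including a phone browser or an app that speaks the Ollama API, can reach it directly, no TeamViewer needed. A sketch for the systemd install; the LAN IP is a placeholder, and exposing it beyond your LAN should go through a VPN or reverse proxy:

    # Override the service so Ollama binds to all interfaces
    sudo systemctl edit ollama.service
    #   [Service]
    #   Environment="OLLAMA_HOST=0.0.0.0"
    sudo systemctl restart ollama

    # From the phone on the same network:
    #   http://192.168.1.50:11434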

How do I uninstall Ollama from my computer? I have no graphics card.

PTRAARON
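
On uninstalling: for the script-based Linux install, Ollama's docs walk through removing the service, binary, models, and service user by hand. A sketch with the default paths; verify yours before deleting:

    # Stop and remove the systemd service
    sudo systemctl stop ollama
    sudo systemctl disable ollama
    sudo rm /etc/systemd/system/ollama.service

    # Remove the binary, downloaded models, and the service user/group
    sudo rm $(which ollama)
    sudo rm -r /usr/share/ollama
    sudo userdel ollama
    sudo groupdel ollama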

You installed Llama twice in this video, once from the CLI and once from LM Studio. Can I remove the CLI version to save space? If so, please tell me how.

drift
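
On freeing the space: yes, the CLI copy and LM Studio's copy are stored separately, so deleting the Ollama one leaves LM Studio untouched. A sketch:

    # List the models Ollama has stored locally
    ollama list

    # Remove the CLI copy (the name must match the list output)
    ollama rm llama3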

What exactly makes Linux superior for AI?
You do realise that you can run Ollama & LM Studio just as easily on macOS & Windows. Not to mention, they also work with AMD GPUs, not just Nvidia.

chef

Hey, how does one delete a model from Ollama?

Arador