RUN LLMs Locally On ANDROID: Llama3, Gemma & More

Run LLMs locally on an Android device using Ollama. Ollama is a simple tool that allows running open-source models like llama3, Gemma, tinyllama & more.

*Downloads*

*Command List*
termux-setup-storage
termux-change-repo
pkg upgrade
pkg install git cmake golang

_*Setup Ollama*_
git clone https://github.com/ollama/ollama.git
cd ollama
go generate ./...
go build .
./ollama serve &
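_*Run a Model*_
With the server running in the background, you can pull and chat with a model in the same session. tinyllama below is just an example; any model tag from the Ollama library works the same way:
./ollama run tinyllama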

*Watch MORE Tech Videos*

~ *Connect On Instagram* - @KSKROYALTECH

*© KSK ROYAL* 
    *MereSai*
Comments

Local LLMs will be the shit one day... imagine an AI assistant with voice and stuff, offline on your phone, which you can fully trust. What a dream.

laberbla

Followed each step. llama3.2 is insane and fast! ❤🎉

stevenmedina

I get this error when I run ./ollama run tinyllama on my phone: "Error: [0] server cpu not listed in available servers map[]". How can I fix this? Thanks

dennissit
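This error seems to point at the build rather than the run step: the go generate step is what compiles the CPU runner that ends up in that "servers map", so if it failed or was skipped the map stays empty. A guess at a fix, assuming the repo was cloned into the home directory:
cd ~/ollama
go generate ./...
go build .
./ollama serve &
Watch the go generate step for cmake or compiler errors before retrying.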

Thanks bro, nice tutorial, works fine on my S23

nickdreck

Very cool! 👍 Is there a way to run the Ollama Web UI with it?

mikew

The ./ollama green folder does not show up, and I get a big error, please help

banglatechnology

Error: model requires more system memory (4.6 GiB) than is available (2.4 GiB). I am using an Xperia 5 IV and it has 8 GB of RAM.

md.rakibulhaque
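The free-memory figure Ollama reports is what Android leaves available to Termux at that moment, not the phone's total RAM, so an 8 GB device can still fail to load a 4.6 GiB model. A smaller model should fit; for example:
./ollama run tinyllama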

That's all good, but what's the benefit of running it locally on the phone?

UAb-mt

It works, but when I exit and re-enter, it can't... not sure, but it says something about ning? Who tf is ning

Kuma_

Getting the "server cpu not listed in available servers map" error. I literally scoured the internet but couldn't find anything for Termux.

raceup

Hey bro, when I run the command ./ollama serve &
it says "./ollama: No such file or directory".
Can you figure out what the problem is?

abdallahabdoulouafi
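The ./ollama binary only exists inside the directory where it was built, so the command has to be run from there. Assuming the repo was cloned into the home directory:
cd ~/ollama
./ollama serve &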

Hello mate, what is the application you use to manage your phone from the computer? (1:16)

furkanyasar-jrqg

When I run "ollama run dolphin-llama3" in Termux, I get the error "Error: [0] server cpu not listed in available servers map[]". How do I fix it?

bit-worm

Thank you for this. Would Ollama then serve on Android's localhost, and if so, can I build an Android app that uses the server?

DeonBands
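Ollama's HTTP server listens on localhost:11434 by default, so any app on the same device can call it. A minimal sketch with curl against the documented /api/generate endpoint (model and prompt are just examples):
curl http://localhost:11434/api/generate -d '{
  "model": "tinyllama",
  "prompt": "Why is the sky blue?"
}'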

Hi, can you please make a video on installing a Gentoo dual boot, preferably with the KDE graphical shell? I would be very grateful, and I think this would help expand the Gentoo community.

webslime_ceo

It works, but when I try to do it again it says "could not connect to ollama app".

MuhammedArshad-zb

What is the size of these LLMs, for example the Gemma 2-billion-parameter model?

_s
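After pulling a model you can check its size on disk; quantized 2B-parameter models are typically on the order of one to two gigabytes:
./ollama list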

How do I run Ollama again? Because it says "No such file or directory"

Dats

I still get the error "error: could not connect to ollama app, is it running?"

Dats
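The server started with ./ollama serve & dies when the Termux session closes, so "could not connect to ollama app" usually just means it is no longer running. Start it again before running a model; assuming the repo is at ~/ollama:
cd ~/ollama
./ollama serve &
./ollama run tinyllama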

I have a very similar speed even on my laptop 😅

adrianmares