Meta Llama 3.1 - Easiest Local Installation - Step-by-Step Testing

This video shows how to locally install Meta Llama 3.1 8B model and test it on various benchmarks.
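The local install described here typically comes down to pulling the model from Hugging Face with the `transformers` library. A minimal sketch, assuming the `meta-llama/Meta-Llama-3.1-8B-Instruct` model id and default settings (these are assumptions, not the video's exact commands):

```python
# Hedged sketch of loading Llama 3.1 8B Instruct via Hugging Face transformers.
# The model id and generation settings are assumptions, not the video's exact setup.
MODEL_ID = "meta-llama/Meta-Llama-3.1-8B-Instruct"

def pipeline_kwargs(model_id: str = MODEL_ID) -> dict:
    """Arguments one would pass to transformers.pipeline()."""
    return {
        "task": "text-generation",
        "model": model_id,
        "device_map": "auto",   # let accelerate place layers on GPU/CPU
        "torch_dtype": "auto",  # use the dtype the checkpoint was saved in
    }

if __name__ == "__main__":
    try:
        from transformers import pipeline
        pipe = pipeline(**pipeline_kwargs())
        print(pipe("Why is the sky blue?", max_new_tokens=64)[0]["generated_text"])
    except ImportError:
        # transformers/torch not installed; just show the intended configuration
        print(pipeline_kwargs())
```

Note that the 8B weights require accepting Meta's license on the Hugging Face model page and logging in with an access token before the download will start.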

🔥 Get a 50% discount on any A6000 or A5000 GPU rental using the following link and coupon:

Coupon code: FahdMirza

#llama3.1 #llama405b

PLEASE FOLLOW ME:

RELATED VIDEOS:

All rights reserved © 2021 Fahd Mirza
Comments

Thank you so much. It really helped me.

wealth_developer_researcher

Thanks a million for your amazing videos. It is astonishing to see my machine screaming for power, and I really love it. Is there a follow-up video coming shortly? I would like to see how I could connect the setup you demoed to interesting frontend applications. Open WebUI is one of them, or even hosting that Open WebUI on the internet! Thanks, Fahd Mirza, for your good work.

shawnaaa

Thanks for the video! Could you please tell us which versions of Python, torch, CUDA, and so on you used? I couldn't make it work with my GPU even though I have an NVIDIA GeForce GTX 1650 Ti, so I installed the CPU-only version of PyTorch, but the response is extremely slow.

Datascientist
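For debugging "GPU not used" problems like the one above, it helps to print exactly what torch build is installed. A quick sketch (the function name is mine; nothing here reflects the versions used in the video). A GTX 1650 Ti does work with CUDA, but only with a CUDA-enabled torch wheel, not the CPU-only build:

```python
# Sketch: report the local Python / torch / CUDA setup, to check whether
# the installed torch build can actually see the GPU.
import platform

def env_report() -> str:
    lines = [f"python {platform.python_version()}"]
    try:
        import torch
        lines.append(f"torch {torch.__version__}")
        lines.append(f"cuda available: {torch.cuda.is_available()}")
        if torch.cuda.is_available():
            lines.append(f"device: {torch.cuda.get_device_name(0)}")
    except ImportError:
        lines.append("torch not installed")
    return "\n".join(lines)

if __name__ == "__main__":
    print(env_report())
```

If `cuda available` prints `False` despite an NVIDIA GPU being present, the usual cause is a CPU-only torch wheel; reinstalling torch from the CUDA index on pytorch.org normally fixes it.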

I am getting an error having to do with disk offloading. From what I understand, it's caused by not having enough RAM (or GPU VRAM, I don't know; the device_map variable is set to "auto"). Does anyone know how to solve it?

stefanopalliggiano
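The disk-offload error above usually means `device_map="auto"` decided the model does not fit in GPU plus CPU memory and needs somewhere on disk to spill weights. A sketch of the commonly suggested fix, passing an offload folder to `from_pretrained` (the folder name and settings are assumptions, not the video's code):

```python
# Sketch: when device_map="auto" cannot fit the model in GPU + CPU memory,
# accelerate needs an offload_folder to spill weights to disk.
# These kwargs describe the usual fix; they are not taken from the video.
def from_pretrained_kwargs(offload_dir: str = "offload") -> dict:
    return {
        "device_map": "auto",
        "offload_folder": offload_dir,  # where spilled weights are written
        "offload_state_dict": True,     # offload the state dict while loading, too
    }

if __name__ == "__main__":
    try:
        from transformers import AutoModelForCausalLM
        model = AutoModelForCausalLM.from_pretrained(
            "meta-llama/Meta-Llama-3.1-8B-Instruct",
            **from_pretrained_kwargs(),
        )
    except ImportError:
        # transformers not installed; just show the intended configuration
        print(from_pretrained_kwargs())
```

Disk offload is very slow, though; a quantized build of the model, or a machine with more RAM, is generally the better answer.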

Thanks, Fahd, for this great video.
I've installed all the packages and I can run the program without errors (after several fights with library conflicts), but it takes too much time to print the response. Why? Is it because I'm running on the CPU? I don't have an NVIDIA card in my laptop. Is there a way to make it faster?

camelion

I finished the download successfully, but now I want to uninstall it. How do I do that?

ribcage

It can't run on ordinary home machines. So what's the point?

keylanoslokj

Can I run Llama 3.1 from Python (VS Code or Google Colab)? If so, can you tell me how, and what to put in the terminal for that?

yashchaudhari

Hi. Easiest install? Ollama works with Llama 3.1, and you can use the Page Assist Chrome extension for a nice UI.

RABRABB

I will use the web version. Why would I need to spend 6 GB of data?

SabedTech

This video does not explain anything!
👎

AdrianElson-xfiy

Hi, thanks for your video. I'm sorry, but I find it very difficult to follow. Please describe each step and the tools you are using, as not everyone is a software engineer.

syedejazhaider