Build an AI Server for Less Than $1K and Run LLMs Locally for FREE

In this video:
I'm going to show you how to build a budget AI server for under $1200 that can be used for machine learning tasks and AI exploration/training. I'm also going to show you how to run large language models (LLMs) at home for free using LM Studio, a program that runs AI models locally and can download models straight from Hugging Face. We'll walk through the components you need, like a Dell R730 server, Nvidia Tesla P40s, and power. If you already have a powerful gaming PC or Apple Silicon Mac, you don't need the server.
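Once LM Studio is running, its local server speaks the OpenAI-compatible API, so you can script against it. A minimal sketch, assuming the server is on LM Studio's default port (1234) and a model is already loaded; the model name below is a placeholder:

```python
# Chat with a model served by LM Studio's local server (OpenAI-compatible API).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",  # LM Studio's default local endpoint
    api_key="lm-studio",                  # any non-empty string works locally
)

response = client.chat.completions.create(
    model="local-model",  # placeholder; LM Studio answers with the loaded model
    messages=[{"role": "user", "content": "Explain what a GPU does in one sentence."}],
)
print(response.choices[0].message.content)
```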

If you enjoy this video, consider liking and subscribing for more tech tutorials, home lab setups, and self-hosting guides!

---------------------------------------------------------|
Links from the video:

---------------------------------------------------------|
Connect with me:
{Coming Soon}

---------------------------------------------------------|
Music from the video:
End Song generated with Suno.Ai

---------------------------------------------------------|
Timestamps:
00:00 - Intro
00:49 - Parts and Cost
03:45 - Pulling the R730 Out
05:08 - Nvidia Tesla P40 Wiring and Fan Setup
07:37 - The Bootup
08:00 - Installing NVIDIA Drivers
09:00 - Installing LM Studio
11:30 - Testing Models
13:45 - Looking at More Models
14:40 - Outro
---------------------------------------------------------|
Comments:

Good video. I was using Ollama, but LM Studio is also nice, maybe even better with the AppImage for Linux. At 12:55 - I am getting about 70 tokens/s on an RX 6800 XT with the same model loaded with default settings.

_sneer_
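The tokens/s figures mentioned here are easy to measure yourself. A rough sketch against LM Studio's local server, assuming it reports token usage in the response (the endpoint and model name are placeholders); note this timing includes prompt processing, so it slightly understates pure generation speed:

```python
# Rough tokens/s measurement against a local LM Studio server.
import time
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

start = time.perf_counter()
response = client.chat.completions.create(
    model="local-model",  # placeholder; the loaded model answers
    messages=[{"role": "user", "content": "Write a short paragraph about servers."}],
)
elapsed = time.perf_counter() - start

generated = response.usage.completion_tokens  # tokens the model produced
print(f"{generated} tokens in {elapsed:.2f}s = {generated / elapsed:.1f} tokens/s")
```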

Excellent content, you deserve more subscribers and followers. Can you please explain how I can build a server that matches the performance of an AMD Threadripper Pro 7995WX with 128 GB of 6400 MHz RAM, PCIe 5.0 NVMe storage, an RTX 4090, and an ASUS WRX90 Sage motherboard? I am considering four RTX 3090s and a Gigabyte MH53-G40 (rev. 1.x) server motherboard; can I build a PC with the same performance as that Threadripper Pro system using the hardware I am planning to purchase?

husratmehmood

Excellent video, mate. I am playing around with PyTorch on my GTX 1070 (8 GB) in a workstation with dual Xeon E5-2680 CPUs and 128 GB of ECC RAM at the moment. I am limited by power connectors, as I only have one 8-pin and one 6-pin, so I am thinking my best option in terms of max power may be a 16 GB 4060 Ti or a 12 GB 4070. I was looking at the cards you are using and didn't realise that they don't use PCIe 8-pins, so your vid was very useful there! Have subbed to your channel - cheers!

themarksmith
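For a setup like the one above, it helps to confirm what PyTorch can actually see before shopping for cards. A minimal sketch, not tied to any specific build:

```python
# Sanity check: is CUDA visible to PyTorch, and how much VRAM per device?
import torch

if torch.cuda.is_available():
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        print(f"GPU {i}: {props.name}, {props.total_memory / 1024**3:.1f} GiB VRAM")
else:
    print("No CUDA device visible; check the NVIDIA driver install.")
```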

Hey, I love to research. Can we collaborate?

GAMINGDEADCRACKER

I think there is a trade-off between how big a model is and how fast it runs. For this reason, having lots of VRAM on slower cards is not really useful for day-to-day usage. Equally, having a fast card with little VRAM also doesn't make much sense.

Did you have to do anything to combine the VRAM of the two cards? And do they need to be the same cards?

FuZZbaLLbee
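On the VRAM question just above: backends like llama.cpp (which LM Studio builds on) don't literally pool VRAM into one block; they split the model's layers across the cards, and the cards don't have to be identical. A minimal sketch with llama-cpp-python, where the model path and split ratio are placeholders:

```python
# Split one GGUF model's layers across two GPUs with llama-cpp-python.
from llama_cpp import Llama

llm = Llama(
    model_path="models/some-model-Q4_K_M.gguf",  # placeholder GGUF path
    n_gpu_layers=-1,          # offload every layer to the GPUs
    tensor_split=[0.5, 0.5],  # share the layers roughly evenly across two cards
)

out = llm("Q: Why use two GPUs for one model? A:", max_tokens=64)
print(out["choices"][0]["text"])
```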