RTX 3060 12GB vs 4090 🤔 Do You Really Need an RTX 4090 for AI?

Links referenced in the video:

GPUs Compared:

Hardware for my PC:

Alternative prebuilds to my PC:

Cheapest recommended PC:

Come join The Learning Journey!

If you found anything helpful, please consider supporting me and the content I am trying to produce!
Comments

I was going to buy an RTX 3060 with 12GB, and you've convinced me that for the price it's a great buy for my AI use.

twoblink

Thanks for making this; I wasn't able to find any other resources with people running Stable Diffusion benchmarks on a 4090. You even mentioned a number of AI tools I wasn't familiar with. Subscribed.

OrangeJucee

Very informative video. The only thing I'd add is that it's worth remembering you can also rent a system with 24GB of VRAM running an A5000/3090/4090 for ~50c/hr on a service like vastAI or runpod. For the cost of a 4090 you can get a *lot* of hours of GPU time on one of these services, and scale the cost to how much VRAM you actually need. If you just want to offload a 13B LLM to a remote service so you can use your GPU for other stuff at the same time (I do this a lot with Herika, the AI-powered Skyrim bot), you can get by with renting a 12GB or 16GB card depending on quantization, so it's even cheaper.

By all means, if you're wealthy and dropping ~$1700 is no big deal, or you're going to be doing a lot of training jobs where your system will actually be using the card most of the time, I think investing in a 4090 (or multiple) is worth it. But if you're just screwing around with this stuff a couple of hours a day, something like two 3060 12GBs or a used 3090 makes a lot more sense financially, and then you can use the money you save to rent GPU time when you need a little more.

stop_tryharding
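As a rough illustration of the rent-vs-buy math above (a sketch using the commenter's ~$0.50/hr and ~$1700 figures; the two-hours-a-day usage pattern is an assumption):

```python
# Rent-vs-buy break-even, using the commenter's rough figures.
CARD_PRICE = 1700.00    # approximate 4090 price (USD)
RENTAL_RATE = 0.50      # ~cost per hour for a rented 24GB card (USD/hr)

breakeven_hours = CARD_PRICE / RENTAL_RATE
print(f"Break-even: {breakeven_hours:,.0f} rented hours")  # 3,400 hours

# At a hobbyist's "couple hours a day" of actual GPU use:
hours_per_day = 2
years = breakeven_hours / hours_per_day / 365
print(f"That's ~{years:.1f} years of daily use before buying wins")  # ~4.7 years
```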

The 3060 is a GPU I recommend for folks who have an interest in AI but don't expect to be a full-time creator or AI developer or whatever. It's probably what I'd buy if I were starting from scratch (I bought a 3050 last year when the price went down from horrendous to just bad). Until the 4060 Ti 16GB is at a reasonable price (likely sometime in the RTX 5000 era), it's probably the best option.

pokepress

Would love to see a comparison of the 4060 Ti (16GB) vs the 4090. I wonder how much better the 4060 Ti performs than the 3060.

jasonshen

Great video, very informative. I always appreciate research done outside of gaming scenarios.

bwowzah

I think the more interesting question is "which models can you run on 4090 that you cannot run at all with 3060?"

If you don't get paid for running AI tools, paying more than 5x the price for somewhat faster AI computation is typically not worth it. But if you need CUDA and a big model, the real question is 3090 vs 4090, because both have 24GB of VRAM.

And if your current and future AI models fit in 8GB of VRAM, go with the RTX 3060 Ti, because it's quite a bit faster and not much more expensive than the 3060 12GB.

The RTX 4060 Ti 16GB is also a pretty good option. It's a bit overpriced compared to the 3060, but you get 16GB of VRAM, which some models require to run at all.

MikkoRantalainen
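A rough way to answer the "which models fit at all" question is to estimate weight memory from parameter count and quantization width. A back-of-the-envelope sketch (real usage adds KV cache, activations, and framework overhead on top):

```python
def weight_vram_gb(params_billions: float, bits_per_weight: int) -> float:
    """Approximate VRAM needed just for the model weights."""
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return bytes_total / 1024**3

# A 13B LLM at different quantization levels vs the cards discussed:
for bits in (16, 8, 4):
    print(f"13B @ {bits}-bit: ~{weight_vram_gb(13, bits):.1f} GB")
# 16-bit: ~24.2 GB (needs a 3090/4090-class 24GB card)
# 8-bit:  ~12.1 GB (borderline on a 3060 12GB)
# 4-bit:  ~6.1 GB  (fits an 8GB card like the 3060 Ti)
```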

Damn, I just want to generate AI t1tt1es man

hhklbhhksm

We upgraded all the workstations in our studio to the RTX 4090 over a few months, between December 2022 and April 2023. The 24GB of VRAM is a huge advantage for AI models, but also for video editing, VFX, complex 3D scene rendering, etc.
Moreover, the computational power of the 4090 is just incredible.
As professionals, we pay back the investment in 6-8 months, and then the 4090 becomes a moneymaker for at least a year. That means a 150-200% return on the initial investment in 18 months. No stock can beat that.

We have a few 3060 12GBs as "secondary" display-only GPUs in a few workstations, and I can confirm we get very similar results to yours.
Very nice test. Congratulations!

blender_wiki

Would love to see you compare something above the 3060 that's still considered budget next to a 4090, like a 4060 Ti 16GB vs the 4090, or maybe a budget faceoff of the 3060 12GB vs the 4060 Ti 16GB.

_HashForce

Can you do the same but with two RTX 3060s vs one 4090? I get that it won't double the VRAM for loading larger models, but I think you'd still get a performance boost from the extra cores. Am I right?

sonybeta
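For what it's worth, some frameworks can shard one model across both cards, so two 12GB GPUs can together hold a model that neither fits alone. A minimal sketch using Hugging Face transformers (requires the accelerate package; the model name is just an example):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-13b-hf"  # illustrative; any causal LM works
tokenizer = AutoTokenizer.from_pretrained(model_id)

# device_map="auto" spreads the layers across all visible GPUs,
# so the combined VRAM of two 3060s (24GB) is what matters for fitting.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",
    torch_dtype="auto",
)

inputs = tokenizer("The best budget GPU for AI is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that this naive sharding runs the layer groups sequentially, so it mostly adds capacity rather than speed; per-request speedups need tensor parallelism in frameworks that support it.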

I own an RTX 3060 12GB and use the live AI voice changer; it has around a 1-second delay on the max chunk setting, and it's incredible.

TrickTease

Hello, I'm curious about the factors determining a GPU's performance with the AI voice changer. While I understand that VRAM is pivotal, does the specific model play a significant role? I noticed you recommended at least a 3060 with its 12GB of VRAM, yet the 3070, despite being more expensive, comes with only 8GB of VRAM. Which of the two offers better performance in this context?

ffuwarin

Your time and effort are greatly appreciated. As a newbie to deep learning and the tools needed for its implementation, I find your video extremely informative. Subscribed.

JD-jdeener

A 60-minute difference is not trivial. If you're paying an engineer, the 4090 saving an hour on each run is worth $100-$300 or more, which makes the 4090 an incredible value for a business.

adrianmaule
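The arithmetic behind that claim, sketched with illustrative numbers (the $100-$300/hr engineer rate is the commenter's; the card prices are assumptions):

```python
# How many one-hour-faster runs pay off the 4090's price premium?
PRICE_4090 = 1700        # assumed street price (USD)
PRICE_3060 = 330         # assumed street price (USD)
ENGINEER_RATE = 150      # USD/hr, middle of the commenter's $100-$300 range
HOURS_SAVED_PER_RUN = 1.0

premium = PRICE_4090 - PRICE_3060
runs_to_break_even = premium / (ENGINEER_RATE * HOURS_SAVED_PER_RUN)
print(f"Premium ${premium}; break-even after ~{runs_to_break_even:.0f} runs")
# ~9 runs, which is why the math favors the 4090 for a business.
```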

So in a program like Topaz Video AI, you can now run multiple GPUs. That means two 3060s will be faster than a single 4090 for jobs with more clips in the queue.

Tore_Lund

The timing of this video is perfect, thanks for making it. For sure getting the 3060!

TantuBeats

Great video. What would be great is 2 x 3060, for the same total VRAM; the only catch is that not all programs will utilize both cards.

Johan-rmec

I would love to see the same comparison between a 4060 Ti 16GB and a 4070 Super 12GB.
These two GPUs are much closer in price, but I wonder if the memory-bandwidth limitations of the 4060 Ti would hold it back relative to the 4070 Super.

LouisDuran

I think you should repeat the experiment with two 3060 12GBs in the PC and compare them to the 4090, and then two 3090 24GBs against one 4090. Great video, btw!

fulldivemedia