8 GB VRAM is a Problem. Is 10G any Better?

8 GB of VRAM is becoming a problem with the games coming out now; it isn't even uncommon for games to want about 12 GB. The RTX 3080 released a couple of years ago and is still a beast of a card. Can it hold up?

My Spotify:

0:00- The memory problem
1:47- 3080 in RE4
3:04- 3080 in Last of Us
3:38- 3080 in Fortnite
5:13- 3080 in Atomic Heart
5:26- THE IRONY
6:51- VRAM misconceptions
7:39- Who VRAM actually affects
9:36- Does AMD's approach work?
Comments

If Intel can afford to put 16GB VRAM on a $350 card, I don't want to hear any excuses for Nvidia.

livedreamsg

I remember when 512MB was enough for gaming.

Thank you for your service, GeForce 8800 GT.

MrSwallows

I remember choosing the 6800 XT 16 GB over the 3080 10 GB because of this.

Tubes

I mean, the latest AAA games eventually become affordable Steam Sale games, and so the same problem eventually hits you even if you're not buying at launch.

LucidStrike

VRAM isn't the problem. Nvidia's VRAM is the problem.

Barrydick

As soon as I heard that the RTX 4060 was gonna release with 8 GB of VRAM, I instantly went ahead and purchased the RX 6700 XT with 12 GB of VRAM, and honestly, it is a HUGE game changer, at least for me.

madrain

Working adult here who buys AAA games, so yeah, this affected me. It was sobering to start up Company of Heroes 3 and find I couldn't max it out due to a VRAM limit on my 2070 Super. I've had this card for 3 years, and I still like it, but yeah, sign of the times. So now it sits in a secondary PC, and I dropped the cash on a 7900 XT. Problem solved, and now I'm back to running everything again and not sweating VRAM issues in Last of Us, COH 3, etc.

ConfusionDistortion

I own a 3080 10 GB, and my favorite game received tons of graphical updates to the point where 10 GB isn't enough anymore. I had to cut all settings from ultra to a medium/high mix to keep it over 60 fps, down from 90 two years ago.

liberteus

This is also why I'm going to take a 7900 XT over a 4070 Ti. 20 GB seems a lot more future-proof than 12.

trrgfreddrtgf

Nvidia: “Here gaming studio, $$ to use more VRAM”

Also Nvidia: “Higher VRAM costs

KobeLoverTatum

I got a 6800 XT for 3070 price, and I'm really happy with it. The video export times are really good, too.

(Don't support companies, support the better product :))

rajagam

I just want to say one thing. I got a 1060 6GB in 2016 and spent nearly 7 years with it. Then I finally took my hard-earned money and bought a 3080 Ti in November, this Black Friday. 12 GB of VRAM. Then IMMEDIATELY new AAA games became this crazy demanding, and devs are saying that 12 GB is the minimum. On top of that, NVIDIA is effectively implementing planned obsolescence. The 4070 Ti, the card superior to my 3080 Ti, had no evolution in VRAM. It's a 12 GB card. I just gotta say... it hurts to get an upgrade after 6.5 years only to immediately end up as the new low tier for this future they speak of. And I do blame NVIDIA. No card above the 3060 should have only 8 GB, and the 3080 Ti should have been a 16 or 20 GB card. 3070 owners have every right in the world to be mad. NVIDIA KNEW this was an issue, but they don't care. They still don't.

ngs

So AAA games are a niche now? RE4 sold 5 million copies, Elden Ring sold 20 million, The Witcher 3 sold 40 million, etc. I hate it when people misuse Steam Charts to prove their point. Hogwarts Legacy is a single-player game, same as Elden Ring (kinda), but a month after launch people move on, because it is a single-player game!!! People finish it and move on. Nvidia giving you 8 GB of VRAM for the x70 series was a slap in the face to consumers, and now they are doing the same with 12 GB. People who bought the RTX 3070, and who will buy the RTX 4070, will want to play the latest AAA games.

sirabee

When Nvidia released the 3060 with 12 GB of VRAM, everything up to the 3070 Ti should have also had 12 GB, with the 3080 and 3080 Ti getting 16 GB. I just hope this is the straw that costs them enough market share to change their ways, instead of always thinking they can do whatever they want and people will just buy it.

stratuvarious

Very good observation, VEX. The older Pascal cards with 8 gigs of VRAM utilize only whatever feature sets they have baked in. The problem now is that all these new advanced DX12 features, plus higher resolutions, become more taxing on limited VRAM buffers in niche situations. There's a car analogy here: is the car fast but runs out of gas quickly (tiny tank)? Or can it get to sixty really quick but top out at 80 mph (low gearing)? I really think everyone wants something that performs great and has future potential, practicality, and value, hoping their GPU will last a good while for their pricey investment. Limiting the RAM only limits the possibilities for game developers.
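To put rough numbers on the resolution point: render-target memory scales linearly with pixel count, so 4K needs 4x what 1080p does for the same set of buffers. Here's a back-of-envelope sketch; the buffer counts and formats are made-up illustrative assumptions, not any specific engine's setup:

```python
# Rough sketch of why resolution multiplies VRAM pressure.
# Illustrative numbers only; real engines use many more buffers,
# plus textures, meshes, and compression.

def rt_bytes(width, height, bytes_per_pixel):
    """Memory for a single uncompressed render target."""
    return width * height * bytes_per_pixel

def frame_budget_mib(width, height):
    # Hypothetical deferred-renderer setup: four G-buffer targets at
    # 4 bytes/pixel, a 16-byte/pixel HDR target, and a depth buffer.
    gbuffer = 4 * rt_bytes(width, height, 4)
    hdr = rt_bytes(width, height, 16)
    depth = rt_bytes(width, height, 4)
    return (gbuffer + hdr + depth) / 2**20

print(f"1080p: {frame_budget_mib(1920, 1080):.0f} MiB")  # ~71 MiB
print(f"4K:    {frame_budget_mib(3840, 2160):.0f} MiB")  # ~285 MiB (4x)
```

Render targets alone are a modest slice of a card's VRAM, but every resolution-dependent resource (shadow maps, post-processing chains, upscaler history buffers) scales the same way, which is how 8 GB cards run out of headroom at 4K first.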

Obie

Definitely was waiting to see what you'd contribute to the discussion. I definitely see this as a negative for people wanting to play AAA games in 2023 with high detail, but it will mean a fire sale of great deals on second-hand graphics cards for competitive gaming.

franciscoc

It is very true that most people aren't actually affected by the VRAM issue right now. The real controversy is Nvidia not providing enough VRAM while pricing their GPUs as high-end models. No one is complaining that the 3050 or the 6600 only gets 8 GB. It's the 3070 and the 3080 (10 GB) that attract all the attention.

veda

Man, I'm about to buy my first GPU, and your videos are answering all my doubts.

Thank you sm 🤗

bladimirarroyo

I expected that when the games designed for the current-gen consoles (Xbox Series, PS5) started releasing on PC, this was gonna start being a problem. That's why, when I was looking to upgrade my 2070, I wanted something with a minimum of 12 GB of VRAM. Since I couldn't get a new 3080 (or Ti) for a reasonable price, I went with the RX 6900 XT and its massive 16 GB of VRAM. Since it was $650, it felt like the best price-to-performance in the range I was looking at.

stratuvarious

One little flaw is that you didn't turn on the setting that shows "memory usage \ process". This one in Afterburner/RTSS will show the ACTUAL VRAM usage of the application, not what is allocated on the whole card.
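The allocated-vs-used distinction this comment makes can be sketched in a few lines. The function names and numbers below are hypothetical illustrations, not a real Afterburner or NVML API; the point is only that the whole-card figure bundles other processes and driver overhead in with the game:

```python
# Sketch of per-process VRAM vs. whole-card allocation.
# Hypothetical helpers; not an actual monitoring-tool API.

def per_process_vram_mib(processes, pid):
    """Sum VRAM used by one PID from (pid, bytes_used) samples.
    This is what a "memory usage \\ process" counter reports."""
    return sum(b for p, b in processes if p == pid) / 2**20

def total_allocated_mib(processes, driver_overhead_bytes):
    """Card-wide figure: every process plus driver/OS reservations.
    This is what a whole-card "memory usage" counter reports."""
    return (sum(b for _, b in processes) + driver_overhead_bytes) / 2**20

# A game using ~6 GiB can show ~9 GiB "allocated" once browsers,
# overlays, the compositor, and driver overhead are counted.
samples = [(101, 6 * 2**30),    # the game
           (202, 1 * 2**30),    # browser, overlays, etc.
           (303, 512 * 2**20)]  # compositor
print(per_process_vram_mib(samples, 101))          # 6144.0 (game alone)
print(total_allocated_mib(samples, 1536 * 2**20))  # 9216.0 (whole card)
```

This is why whole-card readings overstate what a game needs: cutting the game's settings only shrinks the first number, while the gap between the two belongs to everything else on the system.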

stephenpourciau