8GB GPUs are Officially Dead........ (and intel killed them)

soooooo...... the Intel Arc B580 is here and I'm late lol. I really like it. If you wanna pick one up for yourself, here's an affiliate link (I get some money at no cost to you, thanks):

Good luck finding these, because people are buying them up. The GPU market has been extremely stale, especially in the "entry-level". Nvidia and AMD have been selling cards with 8GB of VRAM, and SOOOO many games are easily going over that right now.... think about a year down the line?? The B580's mission is to fix that with 12GB of memory at an affordable price, and I like that.

0:00- 8GB in 2024 is crazy
2:39- 1080p ain't really the play anymore
3:38- introducing the Intel Arc B580
4:10- Retribution vs RTX 4060
13:20- Where the B580 falls short
17:27- B580 wins hands-down tho
18:13- This Card is an OVERCLOCKING DEMON
20:45- Destroys the RX 6600
22:03- How does RX 6750 XT hold up?
24:25- B580 is the clear choice
26:45- 100K, pog.... also update!
Comments

Don’t forget it’s highly rumored that the 5060 will have 8gb of vram

alexmeek

Imagine telling somebody 5 years ago that Intel GPUs and Ryzen CPUs would be the best way to game on PC XD

schnitzel_enjoyer

Westerners complaining about paying 200-300 USD for GPUs when I have to pay triple that price whilst having a quarter of their salary... us 3rd worlders get bullied too much.

LukaMamukelashvili

Should also note the B580 is on launch drivers, as it just came out. The RTX 4060 has had 18 months of post-launch driver development.

kaseyboles

It's just a matter of time until someone defends 8GB by saying "8GB for Nvidia is the same as 12GB for AMD/Intel". They are the Apple of the GPU market.

tomthomas

I hope they sell a TON of B580s.
We need competition in the GPU market so badly.

jazeenharal

Nvidia is like “hold ma beer, gonna launch another 400 usd 8gb card”

krugerblue

This is how it should have been:
- 5060 12GB
- 5070 16GB
- 5080 20GB
- 5090 32GB

meerminman

Not watching this; the real issue is the lack of optimization on developers' part. They offload the cost of this to the consumer by putting the extra load on the GPU. Games now look worse with better tech than several from the early-to-mid 2010s. This is the entire controversy surrounding new game engines and the apparent lack of progress and performance for the leaps in technology. It is laziness for the sake of capital gain.

DiscDoctore

People with rtx 3060 12GB: They called me a madman...

higorss

Nvidia: Guess what suckers, 5060 is also 8gb vram

hossein

Thanks Vex for the great content this year ✊. May you and your family have a great Christmas 🎄🎅, an awesome new year and a terrific 2025. Cheers 🍻 and looking forward to more awesome content in 2025.

michaelthompson

So basically to sum this up for everyone who doesn't wanna watch the video....

the Nvidia 4060 is you, and the Intel B580 is the guy she told you not to worry about.

CaH

Wow no one has mentioned the overclocking in all the reviews I've seen. That's quite a significant uplift in performance. B580 the value king.

fttmvp

Yet it still feels like yesterday when the R9 390(X) launched and I was like "whoa, 8GB, who even needs that much VRAM"

How time flies

keulloe

The 5070 coming out with 12gb is criminal!

FurBurger

8 gigs is not enough anymore, me with 6 gigs 💀

ishiyu

Modern PC gaming is so unbelievably bad it’s actually kind of insane to think about. I’ll keep playing games from 2016 back.

TempleofRain

Got a used RX 6800 last year for less... That's how bad the new GPU situation is when the B580 looks like great value. I hope this pushes all 8GB GPUs under $200 USD.

UncannySense

At the end of the day, IMO as a 20+ year graphics programmer, these games should be abiding by the user's graphics quality setting and dynamically scaling the LOD according to how much VRAM their GPU has. This means everyone can have ultra-quality textures up close, but the quality drop-off with distance will be greater on GPUs with less VRAM. There could also be an LOD balance where an end user sacrifices the up-close quality of material textures for more consistent texel-to-pixel density across the scene.

Games definitely should not ship where they just blindly use up all the VRAM and then start shuttling resources from system RAM to render frames. That is at least better than over-committing VRAM and causing a system crash, but the framerate should never be able to drop to a slideshow because resources overflow out of VRAM into system RAM. The engine should be doing everything it can to keep performance up - otherwise what's the point of things like dynamic resolution scaling? What's the priority here, graphics or performance? They need to make up their mind.

CharlesVanNoland
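A minimal sketch of the idea in the comment above: clamp a global texture mip bias so the resident texture working set fits a VRAM budget instead of spilling into system RAM. Everything here (the Texture struct, the budget numbers, the quarter-per-mip size assumption) is hypothetical and illustrative, not taken from any real engine:

```cpp
#include <algorithm>
#include <cstdint>
#include <iostream>
#include <vector>

struct Texture {
    std::uint64_t baseMipBytes;  // size of mip level 0 (full resolution), in bytes
    int           mipCount;      // number of mip levels available
};

// Dropping one mip level roughly quarters a texture's memory footprint.
std::uint64_t sizeAtBias(const Texture& t, int bias) {
    int clamped = std::min(bias, t.mipCount - 1);
    return t.baseMipBytes >> (2 * clamped);
}

// Pick the smallest global mip bias whose total working set fits the VRAM budget.
int chooseGlobalMipBias(const std::vector<Texture>& textures,
                        std::uint64_t vramBudgetBytes, int maxBias = 4) {
    for (int bias = 0; bias <= maxBias; ++bias) {
        std::uint64_t total = 0;
        for (const auto& t : textures) total += sizeAtBias(t, bias);
        if (total <= vramBudgetBytes) return bias;  // fits: stop lowering quality
    }
    return maxBias;  // even the lowest quality doesn't fit; streaming has to cover the rest
}

int main() {
    // Hypothetical scene: 48 textures at 64 MiB each (~3 GiB) vs. a 2 GiB texture budget.
    std::vector<Texture> scene(48, Texture{64ull * 1024 * 1024, 10});
    std::uint64_t budget = 2ull * 1024 * 1024 * 1024;
    std::cout << "Chosen global mip bias: " << chooseGlobalMipBias(scene, budget) << "\n";
}
```

A real engine would make this per-asset and distance-aware rather than one global bias, which is exactly the texel-density trade-off the comment describes; the sketch only shows the budget-fitting step.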