Nvidia is Just Trolling Gamers Now! - RTX 4050 Already Obsolete

We now have some information surrounding Nvidia's entry-level RTX 40 series graphics card, the RTX 4050. This GPU will allegedly use the same configuration as its laptop variant, just with higher clock speeds: a mere 6GB memory buffer on a 96-bit bus. That makes the upcoming "budget" Ada Lovelace GPU DOA. There's no way around it: when this GPU comes out, it will struggle with the latest triple-A titles at 1080P. The fact that Nvidia is willing to release a GPU in 2023 that will cost over $300 and won't even let you game comfortably at 1080P is just straight-up sad at this point.
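
For a back-of-envelope sense of what that 96-bit bus means, here's a quick bandwidth estimate; the 18 Gbps GDDR6 speed is an assumption, since memory clocks haven't been confirmed:

# Rough memory-bandwidth estimate for the rumored desktop RTX 4050.
# ASSUMPTION: 18 Gbps GDDR6; the actual memory speed is unconfirmed.
bus_width_bits = 96
data_rate_gbps = 18  # per-pin transfer rate (assumed)
bandwidth_gbs = bus_width_bits / 8 * data_rate_gbps
print(f"{bandwidth_gbs:.0f} GB/s")  # 216 GB/s
# For comparison, the RTX 3060's 192-bit bus at 15 Gbps works out to 360 GB/s.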

SOURCES

Use my Amazon affiliate links to support the channel:

Watch my RTX 4090 undervolting & power limiting video here:
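
As a side note on why undervolting pays off: to first order, dynamic power scales with voltage squared at a fixed clock, so even a small voltage drop saves real watts. A minimal sketch with illustrative voltages, not measured values from the video:

# First-order estimate of power savings from undervolting at a fixed clock.
# Dynamic power is roughly proportional to C * V^2 * f, so at constant f
# the power ratio is (V_new / V_old)^2.
# ASSUMPTION: the voltages below are hypothetical examples.
v_stock = 1.05      # volts
v_undervolt = 0.95  # volts
ratio = (v_undervolt / v_stock) ** 2
print(f"~{(1 - ratio) * 100:.0f}% lower dynamic power")  # ~18%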

RTX 4090 vs RTX 3090 MEGA Benchmark 4K 42 GAMES TESTED - The Best Gen-on-Gen Upgrade?
Check out E-core gaming benchmarks tested on my tuned 13700K

TEST SPECS
CPU: i7-13700K 5.5GHz P-Cores, 4.5GHz E-Cores, 4.9GHz RING
CPU Cooler: Arctic Liquid Freezer II 360 AIO
RAM: Patriot Viper Venom RGB 6800MT/s C34 manual tuned timings
Motherboard: MSI Z790 Carbon Wifi
GPU: MSI RTX 4090 Gaming X Trio
SSD: Corsair MP600 Pro 4TB
PSU: EVGA 1000 G3
OS: Windows 11 Pro
Nvidia Driver Package: 528.49

Interested in picking up a new GPU? You can use my Amazon affiliate link below

Follow me on Twitter

CPU: AMD Ryzen 9 5900X
CPU Cooler: Thermaltake TH360 SNOW
RAM: G Skill Trident Z 32GB(4x8GB) 3600MHz CL14
Motherboard: Gigabyte X570 Aorus Master
Graphics Card: ASUS ROG STRIX RTX 3090
SSD: Samsung 970 EVO Plus 1TB
CASE: Thermaltake View 51 Snow
Monitor: BenQ XL2730Z 144Hz 1440P
OS: Windows 10 Pro 64-bit

Does Your CPU Matter for 4K Gaming - Retesting My RTX 4090 With Intel's i7-13700KF - 21 Games Tested

MSI Z790 Carbon Wifi overview & Test Bench upgrade

Watch my MSI RTX 4090 Gaming X Trio Review Here

#RTX4060 #RTX4050 #Nvidia #Graphicscards

COMMENTS

A little trolling from Nvidia a day keeps the buyers away.

clinged

The saddest thing is that despite the higher prices, people are still gonna buy whatever Nvidia throws at them.

Edit a few weeks after the 4070 release: I'm glad to know I was very wrong.

gerardomarca

If the 4050 ends up being $350, you might as well get the A770 with 16GB of VRAM.

shadowminor

This is why DLSS 3 isn't being released for the NVIDIA 30 series; the 40 series hardware other than the 4090 is a joke.

dh

Can't wait for Intel's Battlemage cards. Honestly, this is the first time in 3 years I'm getting excited about new GPUs. Never expected the excitement to come from Intel, though, lol.

rwottevanger

Nvidia is digging their own grave right now. The first thing average consumers see when looking up "Nvidia 40 series" is reviews saying how shit the cards are.

tony_T_

The RTX 4050 should be the new GTX 1650 Super: $200 max, at a push.

roythunderplump

I remember when Nvidia had some amazing budget cards, like the 750 Ti and the 1050 Ti, but this is sad. I wouldn't be surprised if it ends up worse than the 3050 in cases where the 3050's higher VRAM matters.

rambow

If you get a 4050, you'll need to upgrade before it even comes in the mail.
Planned obsolescence and market manipulation are the theme this gen.

redinthesky

Honestly, if I were in the market for a GPU, I would just bite the bullet and get an Intel Arc A750 for ~$250. This is getting ridiculous.

htzzYT

Not too long ago, the xx50 card for Pascal was like $150 CAD. Selling a 50-class card for $300+ USD is literal insanity.

craigdaurizio

They must be counting DLSS fake frames as real ones when calculating the prices.

taotie

This is what I'm talking about. This would be a great time for AMD to flood the market with GPUs, undercutting Nvidia at every price point. But they just can never hit the mark, man. WTF!

NBWDOUGHBOY

AMD: So I can increase the price of my GPUs too?

raychii

Unless they do some magic with the drivers, 6GB will add stuttering on top of the stuttering current games already have.

RobertFromEarth
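
Rough numbers behind that stutter worry: once assets spill out of a 6GB buffer, they stream over PCIe, which is an order of magnitude slower than local VRAM. A sketch, assuming PCIe 4.0 x16 and the ~216 GB/s bandwidth estimated earlier:

# Why overflowing VRAM causes stutter: spilled assets are fetched over PCIe.
# ASSUMPTIONS: PCIe 4.0 x16 (~32 GB/s theoretical peak) and ~216 GB/s VRAM.
vram_bw_gbs = 216
pcie_bw_gbs = 32
print(f"PCIe fallback is ~{vram_bw_gbs / pcie_bw_gbs:.0f}x slower than VRAM")  # ~7x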

I think they reduced the VRAM to save money and then charged everyone extra to help pay for their new frame generation tech. I mean, really, a 40 series card with 6 gigs of RAM these days? No way. 12 gigs minimum.

jamesmackinlay

Gamers will definitely not forget the last 2-3 years. I will stay away from that company in the future, and I'm pretty sure many other people will too. In a few years I'm gonna build a new system with an Intel GPU in it. Eat that, Nvidia.

Sunandor

$150 would have been an appropriate price for this card.

parzival

I remember back in the day picking up a 1050 Ti for a little over $100 new. It's amazing how things have changed in the last few years.

jacksongunner

If you can't afford, or simply don't want to pay, $1000+ for a GPU, buying *ANYTHING* this gen makes no sense. Used 12GB 3080s and 3080 Tis are already dropping in price, and even they aren't looking like they will be very good for anything over 1080P gaming. 16GB seems like the minimum, and when the 6800 and 6800 XT exist for $400-$550 used and $450-$550 new, they seem like the ones to go with. If you need more _chutzpah,_ then a used 3090 or 3090 Ti for $600-$700 looks like a great choice, along with the 6900 XT and 6950 XT, which run around $650 new.

New games are being coded for a 16GB memory *POOL*, especially games with the PS5 in mind. The PS5, of course, has a 16GB pool of GDDR6 plus 512MB of DDR4 for background tasks. That means that if your GPU doesn't have 16GB on board, you may very well be screwed in a boatload of upcoming titles not made for GamePass/Xbox. This also isn't something that can be easily patched, as it takes quite a while to code around in a remaster or port *IF* you start at the beginning of the process. Fixing it after the fact is exponentially more difficult, and most companies are not willing to put in the resources. Heck, The Witcher 3 has spent months delivering what, 4, 5 patches to address the issue, and they're just making things worse at this point.
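
Putting hypothetical numbers on that pool argument, as a sketch; the OS reserve and CPU-side split below are illustrative guesses, not published figures:

# Rough console-to-PC VRAM budget translation.
# ASSUMPTIONS: the OS reserve and CPU-side data split are illustrative guesses.
ps5_pool_gb = 16.0
os_reserve_gb = 3.0   # assumed OS/background reserve
cpu_side_gb = 3.5     # assumed game data kept in system RAM on PC
gpu_budget_gb = ps5_pool_gb - os_reserve_gb - cpu_side_gb
print(f"~{gpu_budget_gb:.1f} GB of GPU-resident assets")  # ~9.5 GB: 8GB cards are tight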

Now, if you play at 1080P/High without ray tracing, then you *probably* have nothing to worry about with an 8GB card. Even 1440P/High is _usually_ OK with AMD due to SAM, which is far superior to Nvidia's ReBar support, which they really don't seem to give a crap about. Why would they? It would make their cards last longer, which they're trying to avoid by gimping cards via the RAM and memory BUS to force upgrades *WAY* before people would normally need to.

A 3070, which is still selling for over $500 USD, is already obsolete for anything besides 1080P gaming in modern titles, and will probably drop to a 1080P/Medium-settings card within the year. The 10GB 3080? Probably in the same boat, though 1080P/High *_should_* still be OK. A $700+ card to play at 1080P/High? WTF, Nvidia? 12GB? Still not even close to enough for a locked 4K/30 without massive sacrifices. Fine for 1440P/High... kinda... as long as you use DLSS, so you're really rendering at 1080P... In that case, you may as well stick with a 6700 XT for $350-ish.

This is absolute insanity. This generation of graphics cards is complete garbage, and the last gen that everyone thought was such a great deal really wasn't. *Most* people who buy a $300 card expect it to last at least 1.5 generations, preferably 2. Heck, the 1050 Ti lasted 2 gens as a 1080P card! The 1060 6GB and both RX 580s lasted 3 and *4 GENS* respectively, and we are talking about cards that ran $175-$230 new... on day one of release. Now a $700 card looks like it lasted one gen at the resolution it was promoted for. Same with the 3080 Ti, 3070 Ti, 3070, 3060 Ti, 3060 and 3050. What do they all have in common? Way too little RAM for their MSRP, never mind the prices they commanded.

Now we have an $800 card that was advertised as a 4K card breaking at 1440P/High in the last two big AAA releases. Both remasters/remakes of old games? Wtf? The Last of Us looks really, really good with RT on, but without it the game just looks very good. Not incredible. It looks like something a 5700 XT or 3070 would run at 1440P/High at 60-75 FPS. Except it's not. It's a game that crashes the 4070 Ti at those settings.

Lol, this is so, so not good. I'm not just bitching because I have a 3080. I only paid $525 for it, so I didn't get completely screwed, but it's still a lot to pay for a 1080P card. Now, with all of the work Nvidia has put in with Epic and Unreal, there is no way they did not know this was going to be a huge problem, so I have to think this was intentional. That makes it even more dumbfounding that they decided to put only 12GB of RAM on a 192-bit BUS on the 4070 Ti.

If sales weren't bad before, and they are, since Nvidia is showing its assets at 98%, which means the majority of their 660+ billion USD valuation is frickin' *INVENTORY*, I can't see this doing anything to boost their chances of maintaining their huge lead. Hopefully AMD can step up and drop it to a... really big market-share dominance?

Like I said, if I didn't want to spring for a 4080/7900 XTX or the 4090, it seems like the only viable options are used cards, regardless of price point. If there was ever a generation to skip the new cards, it's this one. We thought Turing was bad? Welp, this pretty much makes Turing look like Pascal, the 8800 GT days, or the 7970 era. Besides the top of the line, *EVERY* card last gen, and apparently this gen (for Nvidia), now looks like a *terrible* purchase if you buy new.

I'll wrap up here, since it'd just be me repeating myself at this point. But please, folks: this isn't just about not rewarding companies for screwing us, it's about not wasting your money on something you will most likely not be happy with; something that cannot do what you need it to do.

crzyces