Why Nvidia Is Stuck with Tensor/RT till 2021

Some people keep complaining when I show how small Navi is vs. Turing; they say "But da RT and Tensor Corez take up space!" This video breaks down why Nvidia cannot remove them anytime soon! Timestamps:
1) Intro 0:07
2) It’s Time to STOP the Dumb Tensor Comments 0:52
3) Again, why RDNA is impressive 1:42
4) Analysis of Turing Without Tensor and RT Cores 3:50
5) Proving Turing would still be HUGE without Tensor/RT 6:08
6) Nvidia won't go bigger than GTX 1680 with the 1600 series 7:45
7) The Real Reason Nvidia won’t remove Tensor/RT 8:48
8) Summary of all Main Points 11:28

Broken Silicon [A PC Hardware Podcast] is available on ALL platforms. Raw RSS:

All Music by SAHARA:

LINKS!!!!!!:
Comments

I got a Radeon Image Sharpening ad at the start. AMD sure knows how to create attention-grabbing advertisements, because they are relaxing to watch.

sulphurous

'It just works and we're gonna remove it once you got the taste! That's how good it is.'

docbogus

Polaris was the efficiency king till AMD pushed the clocks too far.

EDIT: I am by no means suggesting Polaris wasn't good. It was good compared to Maxwell, but not Pascal. If Vega hadn't been late, things would have been different.
I won't blame the whole Radeon division for screwing up, but let's be real... Raja Koduri frigged over the GPU division with Vega. I think if Vega had run at lower clocks with the number of CUs it had, it would have been much more efficient. Vega 64 @ 295W TDP is really bad,
and Vega 56 @ 210W TDP (note this card always went over that).

slimstrait
Didn't Jensen Huang say Nvidia spent some 10 years developing their ray-tracing solution? There is no way they would drop RTX and RT anytime soon.

They may change their approach to how they do it, but they won't drop it (because who knows, maybe AMD finds a better and more efficient way).

Ray tracing has always been the goal. It is here to stay.

Nvidia will never again release a high-end card without RTX (at least it's very unlikely).

jonson

Honestly, if consoles are using AMD development kits and games are being ported to PC, I don't see how devs are going to waste time with this... or GameWorks.

evalangley

If AMD's RT patent is accurate, RT cores are not required, and CUDA cores could do the job just as well with assistance from extra cores on the CPU.

RT cores would be redundant and never used, because a single API would be used by game developers for both manufacturers' cards.

If that is true, Nvidia would have backed the wrong horse with RT cores and be forced into a new architecture without them.

The-OGRE

Nvidia fanboy: Useless features at exorbitant prices? Take my money!

Post-RDNA-era Nvidia fanboy: Oh well, Nvidia can just chop off its most important development to compete. We don't really need it.

Doompro

Turing is another broken Nvidia architecture, just like Fermi was.

While Zen was AMD's Sandy Bridge moment, RDNA feels like TeraScale reborn.

By the way, you've become a celebrity in my group.

The_Nihl

I said it when Nvidia came out with the RTX series and I'll say it again: the world is not ready for ray tracing in games. It's pointless right now and too expensive. To me, that makes the die-size argument moot.

However, it's an argument that some of the fanbois love to bring up, and you are right: it makes absolutely no sense whatsoever for Nvidia to drop a feature that they have been touting for the last 11 months. Especially not now, with Sony, Microsoft, and AMD all announcing that the next-gen super consoles are going to be powered by AMD tech with ray-tracing capabilities. Ray tracing has a place in the world; the world just was not ready when Nvidia forced it upon us.

What would compound the situation for Nvidia is if they dropped RT/Tensor and AMD's ray-tracing method not only succeeded, but did so without the performance hit you get with Nvidia's current implementation. How stupid would Nvidia look to even their most hardcore fans and, more importantly, to their most important source of income, their investors, if they were to do this? Nvidia needs to do the exact opposite of dropping RT/Tensor, and instead invest every last available dollar into research and development of ray tracing to reduce that feature's performance hit. Only that way will Nvidia actually compete. That's the ONLY thing they can do from here.

emryssambrosias

Heard about you from not an apple fan and I like your content.

Subscribed and liked.

Tekagi

I was quite disappointed that Turing didn't break the 1000 mm² barrier.

If it had, it would have been comically meme-level stuff.

S

It's nice to see a youtuber go as far into speculation as you do. There's so much that needs to be said about things that are simply obvious, yet most channels won't touch them with a ten-foot pole because they don't want to risk being wrong, even though they still get stuff wrong all the time anyway.

mattroy

You are talking to fanboys; they will never listen to you. Reason is out of this world for them... >:/

evalangley

NVIDIA is in for quite a rocky near future. I'm guessing there will be stiff competition from AMD, and maybe even AMD dominance in the market. Let's see what Intel comes up with, though.

ograin

Subscribing to you was worth it: very good explanations, deeper architecture information, and all of it packed into videos of 20 minutes max!

kyoudaiken

I feel like the claim that the RT/Tensor cores take up a huge amount of die space at least partially comes from a slide in one of Nvidia's presentations, where they showed a picture of a random die sectioned off, with shader/compute taking up half and RT/Tensor cores taking up the other half.

bahkified

I bet Nvidia is stressing big time, as they foresee the same thing happening to them that Ryzen did to Intel.

bit-meiko

Nvidia's strategy is to take baby steps, milk the consumers, and stay just a bit above AMD in performance. They know the cards will sell regardless, and a good number of consumers will just buy the next underwhelming series of cards.
#MaxProfit

BM

I've been subbed and on the notification squad since you had about 900 subs! You do phenomenal work.

skoopsro

Removing tensor and RT cores is easy. You just uncheck those two options in the menu before hitting the "make wafer" button.

jujanhunter