Why Benchmarking CPUs with the GTX 1080 Ti is NOT STUPID!

Check prices now:

Read the written version of this editorial on TechSpot:

Support us on Patreon


Disclaimer: Any pricing information shown or mentioned in this video was accurate at the time of video production, and may have since changed

Disclosure: As an Amazon Associate we earn from qualifying purchases. We may also earn a commission on some sales made through other store links

FOLLOW US IN THESE PLACES FOR UPDATES

Comments

Going over 60 frames per second on a 60 Hz display is still beneficial because it reduces input lag. Even though not all of the frames are being displayed, control inputs are still being processed at whatever frame rate the GPU is able to achieve.
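
To put rough numbers on that point, the sketch below (a hypothetical Python snippet, not from the video) simply converts frame rate into the time between frames, which is also how often fresh input gets sampled:

```python
# Minimal sketch with hypothetical frame rates: the higher the frame rate,
# the less time passes between input samples, even on a 60 Hz panel.

def frame_time_ms(fps: float) -> float:
    """Time between consecutive frames (and input samples) at a given frame rate."""
    return 1000.0 / fps

for fps in (60, 144, 240):
    print(f"{fps:>3} fps -> a new frame (and fresher input) every {frame_time_ms(fps):.1f} ms")
```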

RuinerXL

Great explanation for newcomers to the world of PC building. What's most amazing is how easy it is to show a CPU/GPU bottleneck in most modern games. Great work, keep it up!

Skwisgar

but we want an extreme benchmark battle to the death!!!

ayushdabral

It's simple to understand: with a 1080 Ti you see how many FPS your CPU can handle, then you check how many your GPU of choice can deliver. If the CPU's number is higher than the GPU's, you're in good shape, and if it's considerably higher, you have room for a future GPU upgrade without having to change the CPU.
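
A minimal sketch of that reasoning, with made-up FPS figures: the frame rate you actually get is capped by whichever limit is lower, and the gap between the CPU's limit and your GPU's limit is your upgrade headroom.

```python
# Hypothetical sketch of the comment above; none of these fps numbers are measured.

def effective_fps(cpu_limit_fps: float, gpu_limit_fps: float) -> float:
    """In-game fps is capped by whichever component hits its limit first."""
    return min(cpu_limit_fps, gpu_limit_fps)

cpu_limit = 140  # what a 1080 Ti test suggests the CPU can feed (hypothetical)
gpu_limit = 90   # what your GPU of choice can render at your settings (hypothetical)

print(effective_fps(cpu_limit, gpu_limit))  # 90: GPU-bound, with CPU headroom to spare
print(effective_fps(cpu_limit, 160))        # 140: after a GPU upgrade, the CPU becomes the limit
```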

mateuscampello

I was one of the people asking for a lower-end GPU paired with a similarly priced CPU. This video is exactly what I wanted and needed. Thanks, Steve, for the excellent content!

unit

I agree with all your points. Even though I am a proponent of testing within its own cost range, your method is needed to determine absolute GPU differences. Using mid-range cards is only best for predicting the best real-world pairing combos and for determining where the limiting factor will present itself. Great explanation; this needs to be posted on every Discord forum.

StalewindFarto

This really needs an explanation? I don't even know what to say.

Edit: I should clarify, I'm not criticizing the video; instead I'm both surprised and slightly disappointed that this video needs to exist in the first place.

stargazer

The people being misled are usually just uninformed... benchmarks with a 1080 Ti (which show the potential of the CPU) versus your average GPU-bottlenecked system (which is the real-world application).
People mistake that potential for something that carries over to every GPU... so if the 1080 Ti with an 8700K was ahead by 15 fps, some people think that when they buy a 1050 they will see the same 15 fps difference over a Ryzen system, when in reality there won't be any difference.

So these fanboys will go spreading lies about how their GTX 1050 gets way more frames with Intel... benchmark videos spawn this type of stupidity because most people skip straight to the graphs and don't listen to the testing methodology or any background info.
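
The same point as a tiny, hypothetical calculation: once a slow GPU is the cap, the 15 fps gap from the 1080 Ti test disappears entirely.

```python
# Hypothetical numbers only: a GTX 1050 caps both systems at the same frame rate.
def effective_fps(cpu_limit_fps, gpu_limit_fps):
    return min(cpu_limit_fps, gpu_limit_fps)

gtx_1050_limit = 60                        # assumed GPU-limited fps at these settings
print(effective_fps(130, gtx_1050_limit))  # 60 with the CPU that led by 15 fps
print(effective_fps(115, gtx_1050_limit))  # 60 with the other CPU -> no visible difference
```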

Thesinistereyes

Do people really need you to explain this to them??? Who can be so dumb as to not understand something so basic?

josejuanandrade

It's so addictive to watch these kinds of videos.

KoRnTwIsT

Thank youuuu! People think only CPU bottlenecks exist. Even a 1080 Ti can be the bottleneck at 1080p.

CasualGamers

I'm glad you went over the frame rates at competitive graphics settings. I think that is missing in a lot of benchmarks out there. It's also crazy how turning off shadows in a game can lead to a massive jump in frame rates and can help you see people more easily in different environments (for example: Wailing Woods in Fortnite).

QuickshotGaming

Since the video is discussing CPU benchmarks, I thought it would be a good time to voice a concern I have. The comparison charts for these types of tests are usually missing something that I view as important: the average total CPU utilization during the test.

When talking about future-proofing, CPU utilization matters, because a CPU that delivers a higher frame rate today but is already fully utilized is very likely to underperform in the future, since future games will put more load on the CPU (such as more NPCs or better AI).

So for future-proofing, a CPU with a lower frame rate that is only 50% utilized is a better option than a fully utilized CPU that gives a few more frames.

I do love the benchmark videos you guys make (and the tech news videos), and I'd honestly appreciate it if you would also include CPU utilization when doing a CPU benchmark, since it would make it easier to understand what is happening.
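
For what it's worth, logging average utilization alongside a benchmark run is simple to sketch; the snippet below is a rough illustration (not the channel's actual methodology) using the third-party psutil package.

```python
# Rough sketch: sample system-wide CPU utilization while a benchmark runs and
# report the average. Requires: pip install psutil
import time
import psutil

def average_cpu_utilization(duration_s: float = 60.0, sample_every_s: float = 1.0) -> float:
    """Sample total CPU utilization for duration_s seconds and return the mean."""
    samples = []
    end = time.time() + duration_s
    while time.time() < end:
        # cpu_percent(interval=...) blocks for the interval and returns the
        # utilization percentage measured over that window.
        samples.append(psutil.cpu_percent(interval=sample_every_s))
    return sum(samples) / len(samples)

if __name__ == "__main__":
    print(f"Average CPU utilization: {average_cpu_utilization(duration_s=30):.1f}%")
```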

strangerontheroad

Congrats, man. I hope this can get a lot of things outta the way for your future videos. Just keep the link for this one at hand, and when someone trashes you, just link them to it and that's it.

rekoj

To be fair, it is correct... but one also has to remember that GPU drivers affect performance differently with different CPU types. I still remember how Ryzen with Nvidia can't do 100 fps in some old games, while Ryzen with AMD, or Intel with Nvidia, is in the 200 fps range.

One must also always mention, because gamers are kinda not very bright about it, that future games should behave differently, as they have no choice. I am certain an 8600K is better than a 2600X in almost all games today. I am also fairly certain it's an inferior long-term gaming CPU without an upgrade path.

CharcharoExplorer

Hey Steve, great video as always. A bit surprised that people are asking about this, but good job explaining it for people who might not fully understand how component bottlenecks work.

robertwave

Thanks for the explanation, as I was one of those who asked, myself. Great content!!!

nddbulldog

Great content as always, Steve. Nothing I didn't already know, but at least I can refer people to a video that explains it when they ask the question. My only honest critique is that it would've been nice to include results for the 900 series versus the 10 series so everyone can collate them and clearly see the progression in GPU performance. I'm personally hoping that the next series of GPUs makes 1440p a CPU-bound resolution.

oldmanian

Just look at how back in 2012 there was hardly a difference between the i5-2400 and the i7-2600K in gaming, and everyone said just to buy the i5. Nowadays the i7s are still rock solid in gaming, whereas the i5s are definitely starting to lag behind. Just because it's equal now does not mean it will be in 2-3 years' time, when ALL GPUs are around 1080 Ti level or better.

HELZ

Not to mention people upgrade their GPU a lot more often than their CPU, so it makes a lot of sense to look toward the future (GTX 1080 Ti = GTX 1170 = GTX 1250 Ti).

crusadehs