Do Game Devs Care About DLSS?

Let's talk a little about Nvidia's new 50 series cards and DLSS multi frame generation

0:00 Intro
1:53 Multi Frame Generation
3:52 Other new RTX Features
4:31 Will Game Devs see DLSS as an excuse to not optimize?
8:23 Nvidia's problematic showcase
10:09 Where upscalers serve the most purpose for devs
10:56 Outro

Socials ✨
───────────────

#gamedev #nvidia
Comments

A problem I often see mentioned is that upscalers tend to make games look blurrier than they do at native resolution, which is really annoying once you start to notice it.

kerbonaut

3:47 It's gonna be funny to see a lot of new viewers finding this channel and being like "huh, what's Satisfactory?", and then digging to find out how this guy was such a legend in the community with his partner in crime Jace

rkay.gaming

I think the fear is mostly rooted in how so many AAA games these days cut a lot of corners to ship a game, generally due to mismanagement. Instead of DLSS + frame gen being treated as an extra boost in performance taking a non-upscaled 60fps to 70 or 80 fps, we are going to get more studios that put less priority on base level optimization and require DLSS to hit the minimum performance requirements.

Mvvement

The Half Life 2 comparison didn't use frame generation. It used the HL2 RTX remake, a completely different game that was specifically remade to work with ray tracing, with all-new textures and models and all sorts of modern rendering features. I'd say the intent of the original game wasn't lost, as the RTX remake was created with its own intent and not to replace the original.

_C.A.W.A_

This is a good angle for content. I like the way you present, maybe because I'm already familiar, but all in all, this is cool.

rkay.gaming

My big worry is that multi frame generation will still feel like you have massive input lag. Seeing a frame every 8ms but only polling input every 33ms is really bad. Playing a game below 30 fps feels terrible, and this won't fix the feeling. It looks nice in fancy marketing materials, though.
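The arithmetic behind this worry can be sketched in a few lines. This is a minimal illustration assuming a 30 fps base render rate and Nvidia's 4x multi frame generation (1 rendered + 3 generated frames per interval); the numbers are hypothetical, not measurements:

```python
# Displayed frame rate rises with frame generation, but input is
# still only sampled once per *rendered* frame, not per shown frame.

base_fps = 30                    # frames actually rendered per second
gen_factor = 4                   # 4x multi frame generation

displayed_fps = base_fps * gen_factor
frame_interval_ms = 1000 / displayed_fps   # time between displayed frames
input_interval_ms = 1000 / base_fps        # time between input samples

print(f"displayed: {displayed_fps} fps, one frame every {frame_interval_ms:.1f} ms")
print(f"input sampled every {input_interval_ms:.1f} ms")
```

So the screen updates every ~8 ms while the game only reacts to input every ~33 ms, which is why it can look smooth yet still feel like sub-30-fps play.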

Escher

I agree, but I also think devs just should not optimize with upscaling in mind if they want their games to keep improving. There is a reason games were better before upscaling. All games should run on the lowest-end hardware of that generation at 1440p (the new 1080p) and 60fps without upscaling. I personally just won't buy the games that don't. Optimizing a game with a 4090/5090 as the target is stupid.

jmangames

In some games (Stalker 2) it feels like a blessing to have frame gen on and be able to experience a smooth game. But when I turn off the upscaling in many other games (Satisfactory included, but only because Vulkan gives me tons of performance) and suddenly have a much sharper image... it really makes me question why I have to use upscaling and frame gen. I just dislike the lack of clarity.

Pufty

"I don't understand why people are so upset about DLSS."
Visual smoothness with higher input latency on an overpriced GPU is not a good trade-off.

doodleEeto

Personally I won't be picking up the 50 series, because they're simply too damn expensive.
The 5080, for example, costs ~$1000, but here in the EU that price will be closer to ~€1400 with taxes, if you can even get one given the low stock and scalpers.

jordick

Personal: Retired software engineer from NYC. Did not grow up with computers, and was in the first US class of CS graduates. I have loved computers, tech, and video gaming since university; but I must admit it is going to be a net negative for humanity. (It doesn't have to be, but the dynamics of competition for profit/power and global dominance guarantee this.)

The art of writing efficient, fault-free code is a thing of the past, both due to economics and the ease of Internet patch releases. The issue impacts game development, but is far larger. From the 1960s through the 1980s, compute hardware was tremendously expensive, and bugs and post-production corrections were very expensive. Then it all shifted. Today, hardware is very cheap, and programmers (1990-2023) very expensive. I learned to code when you got fired for wasting a byte in a loop in assembly. Today, all that matters is getting it out the door fast. Bugs can always be fixed later, but market share is not something you can make back by being the second to release.

Finally, the biggest shift for programmers (less so for game developers): most programmers in the 1980s had a captive corporate user base. In the 1990s, programming also became a quest for eyeballs. Yes, users were finally freed from being chained to what you wrote. That had a massive impact on software development.

mahakleung

With current hardware there is no reason a game can't keep 100+ FPS if properly done. The fact that we now need to rely on "fake frames" to give the illusion of smooth gameplay is ridiculous to me. The hardware is 100 times more powerful than 15 years ago, and yet games run slower now than they did in the past. You used to be able to run mid-level settings with hardware equivalent to today's 5700X / RTX 3070 tier and get a minimum of 60 fps. Now I install a new game and have to sink all the settings to medium, and even some to low, with DLSS on Performance just to stay above 60 FPS, with hardware that costs 5 times as much as it did 15 years ago. I blame a good portion of this on game engines like Unity and UE. They "sell" their engines by showing off the awesome graphics potential they have, and then devs rely on those out-of-the-box solutions that aren't optimized and release their game thinking "hopefully DLSS or frame generation makes it playable in the future..."

kilrath

Framegen is interpolation, not “prediction”. You made an analogy to predictive netcode, which tries to predict the future, but we have to remember that framegen requires two frames, neither of which is a “future frame”. What’s really happening is that your game is being lagged by a whole frame in order to deliver smoother updates to you piecemeal. Nvidia has put in a lot of marketing effort to convince people that framegen doesn’t have this latency penalty, but it absolutely does.
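The pipeline this comment describes can be sketched as a display timeline. This is a minimal illustration assuming a 30 fps base render rate and 4x generation; the timings are hypothetical and ignore render and interpolation cost:

```python
# Interpolation needs BOTH endpoint frames before it can display
# anything between them, so showing frame N is held back until
# frame N+1 has finished rendering: roughly one base frame of latency.

render_interval = 1000 / 30      # ms between real rendered frames
gen_factor = 4                   # 1 real + 3 interpolated per interval

def display_schedule(n_real_frames):
    """Earliest display times (ms), assuming frame k finishes rendering
    at time k * render_interval and interpolation itself is free."""
    times = []
    for k in range(n_real_frames - 1):
        ready = (k + 1) * render_interval   # must wait for frame k+1
        for i in range(gen_factor):
            # real frame k first, then 3 frames blended toward k+1
            times.append(ready + i * render_interval / gen_factor)
    return times

times = display_schedule(3)
# Frame 0 was rendered at t=0 but can't be shown before t~33.3 ms:
print(f"frame 0 displayed at {times[0]:.1f} ms (one base frame late)")
```

A true predictor would extrapolate from past frames and show frame 0 immediately; an interpolator cannot, which is the latency penalty the comment points at.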

HermiHg

Devs love DLSS. That's why they don't optimise their games and slap it on as an easy solution to their performance problems.

RacerRookie

A problem with DLSS is that some games already use it as a way to make their game run better. I mean, the latest Monster Hunter recommends running DLSS at medium, which is just kind of odd? It really makes the game seem unoptimized if you need that at medium settings.

basseman

I’m glad we still get to have videos from you! I appreciate this view, and I’m excited to see more from you outside of Satisfactory.

processedsoy

"I don't understand why people are so upset about DLSS". I didn't either but according to threads:
They offer a 4090 for $1800 and get 23 FPS with no DLSS. With their DLSS 3.0 = get 100 fps on the same game.
Then they offer (as innovate and "next level") 5090 for $2000 and get 29 FPS with no DLSS while burning another 100 watts of power over the 4090. They release DLSS 4.0 = that they wall around the 5 series cards (unavailable for 4 series). DLSS 4.0 = 160 FPS on same test getting about 4 to 5 fake frames for every 1.
One wonders how the 4 series with do if they allowed their software solution to work on 4 series cards. The actual raw performance of the hardware is pretty much the same, but cost more. Gamers can't help but feel like they are being ripped of - getting a software solution to generate fake frames, than actual next gen tech. [edit} - as another person has noted - the bad lag / latency.

stevenpike

Since my last ATI card, I've been on the Nvidia train. 770, 1070, 2070, and currently a 3080Ti. But I'm probably going to check out the next Gen of both and strongly consider AMD this go around. Not for any other reason than the ability to have choice. I'm not upset with Nvidia over anything, and I'm not smitten with AMD over anything.

I'm just a customer looking for a good deal before the next trade war starts

lapsed

Really great video, my dude. We already miss you as CSS community manager. But the future is exciting! Good luck!

AriinPHD

Don't worry, your analogy to network prediction was actually quite relevant!
Something I wanted to add regarding people saying that DLSS is a way for devs to be "lazy": a lot of people say the same thing about each new UE technology, the latest being MegaLights. I hear a lot of things like "All of this was already possible if you optimized your game; MegaLights is just another way for devs to avoid optimization". But what those people don't see is that this is a good thing. It lets devs get optimized performance without being limited by their engine, unlocking actual artistic freedom.
This shouldn't be a reason not to optimize your game, but it allows artists, who don't necessarily have the engine knowledge needed for those optimizations, to build their maps without constraints and fully express themselves. Which is a great thing.
Same goes for Nanite, Lumen, etc.

And I believe that DLSS falls into that same pot. If it allows a game like Cyberpunk 2077, at maxed-out settings with path tracing at 4K, to run at 240 fps, then I'm all for it.
I understand that it is frustrating, but people have to understand that hardware has its limitations, and if you want games to look super stunning with path tracing and the like, you have to make some concessions.

louisveran