How To Actually Use DLSS 3 Frame Generation!

In this video, we showcase the results of Nvidia's DLSS 3 frame generation technology in several popular games, including Cyberpunk 2077 and Ratchet and Clank: Rift Apart. DLSS uses AI-based algorithms to enhance image quality and performance, and the results are impressive. Watch the video to see how DLSS 3 improves each game's frame rate with minimal impact on visuals.

Worried about additional input latency, or input lag, when DLSS 3 Frame Generation is enabled? We also show system latency with and without DLSS 3 Frame Generation.

Find out how to correctly use DLSS 3 Frame Generation!

System Specs:

Nvidia GeForce RTX 4080 (MSI Gaming X Trio)
Intel 12700K CPU @ Stock
32GB DDR4 4000MHz CL18 Memory
Asus TUF Gaming Z690 D4 Motherboard
Corsair RM850 Power Supply
Resizable BAR Enabled

All metrics were captured using MSI Afterburner in conjunction with RTSS, with GeForce Experience used for latency metrics.

Timestamps:

0:00 - Intro
0:25 - DLSS 2 vs DLSS 3
1:57 - Cyberpunk 2077 Input Latency Test
4:45 - Spider-Man Remastered Input Latency Test + CPU-Bound Scenario
7:40 - Ratchet and Clank Input Latency Test
10:53 - When to Actually Use Frame Generation
12:47 - Extreme Power Savings in Diablo IV

Comments

People don't hate framegen; they hate the pricing and specs of Nvidia's latest products and the attempts to sell DLSS 3 and framegen as actual performance.

PotatMasterRace

Anti-lag and its counterpart work by adjusting frame timings to line up with the CPU (along with a sort of inverse vsync) and by reducing the buffer size so you have fewer pre-rendered frames.

hakarthemage
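
To illustrate the pre-rendered-frames point above, here is a back-of-the-envelope sketch. The queueing model and frame times are illustrative assumptions, not any vendor's actual driver logic: it just shows how a deeper render-ahead queue adds waiting time when the game is GPU-bound.

```python
# Back-of-the-envelope sketch (illustrative assumptions, not any vendor's
# actual driver logic): how the size of the pre-rendered frame queue affects
# input-to-display latency when the game is GPU-bound.

def input_to_display_latency_ms(cpu_frame_ms, gpu_frame_ms, max_queued_frames):
    """When GPU-bound, the queue stays full, so a newly submitted frame waits
    behind every frame already queued before its own GPU work starts."""
    queue_wait = max_queued_frames * gpu_frame_ms
    return cpu_frame_ms + queue_wait + gpu_frame_ms

for queue_depth in (3, 2, 1):  # typical "max pre-rendered frames" style values
    latency = input_to_display_latency_ms(cpu_frame_ms=4.0, gpu_frame_ms=16.7,
                                          max_queued_frames=queue_depth)
    print(f"queue depth {queue_depth}: ~{latency:.1f} ms")
```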

The problem is not that they are using frame generation to sell new cards; it's that they are using frame generation as an excuse to bump up the price.

edwardecl

Why hate a new feature? It IS more frames, fake or not. And the fact is, you really notice the extra FPS but hardly notice the latency at all. Obviously it's just not something you'd use in esports titles, for multiple reasons, including the fact that FPS is already high in those games anyway.

GregoryShtevensh

I have a Lenovo Legion 5i Pro incoming. Thank you for the detailed, real-world info and examples. This totally makes sense and looks like a game changer, especially for single-player games.

Brianybug

Make sure that you've turned off ALL forms of mouse acceleration. One checkbox exists in the Windows Control Panel's Mouse Settings, and there may also be a mouse acceleration setting in your gaming mouse driver software; LG software has its own, for example. Most people aren't even aware that they've been using mouse accel until they actually turn it off and feel how their mouse pointer accuracy seems off... until they become re-accustomed to not having it on.

bryanwashere
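
As a companion to the comment above, here is a small sketch that checks the Windows-level setting. It assumes the well-known registry location HKCU\Control Panel\Mouse, where "Enhance pointer precision" being off corresponds to all three values reading "0"; it does not cover acceleration applied by vendor mouse software.

```python
# Sketch: check the OS-level "Enhance pointer precision" state on Windows.
# Assumes the registry location HKCU\Control Panel\Mouse; all three values
# being "0" means the Windows acceleration curve is disabled. This does not
# cover acceleration applied by vendor mouse software.
import winreg

with winreg.OpenKey(winreg.HKEY_CURRENT_USER, r"Control Panel\Mouse") as key:
    speed, _ = winreg.QueryValueEx(key, "MouseSpeed")
    threshold1, _ = winreg.QueryValueEx(key, "MouseThreshold1")
    threshold2, _ = winreg.QueryValueEx(key, "MouseThreshold2")

accel_off = (speed, threshold1, threshold2) == ("0", "0", "0")
print("Windows mouse acceleration disabled:", accel_off)
```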

A 10ms increase in latency going from just DLSS to frame gen is minuscule. And with DLSS completely off, you get MORE latency than DLSS + frame gen? Why are people complaining about the smallest things?

theorphanobliterator
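
A rough way to see why the penalty stays small at higher base framerates: the sketch below assumes interpolation-based frame generation holds the newest real frame back roughly one real-frame interval so an intermediate frame can be shown first. This is a simplifying assumption for illustration, not NVIDIA's published pipeline.

```python
# Rough model (an assumption about interpolation-based frame generation, not
# NVIDIA's published pipeline): the newest real frame is held back roughly one
# real-frame interval so an interpolated frame can be shown before it.

def approx_extra_holdback_ms(base_fps):
    return 1000.0 / base_fps  # ~one real frame of additional hold-back

for fps in (30, 60, 120):
    print(f"{fps} fps base -> ~{approx_extra_holdback_ms(fps):.1f} ms extra hold-back")
```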

In Cyberpunk it feels awful unless you already have a high base framerate without framegen, but in Rift Apart it feels crazy good: no input lag whatsoever, super high frames even with RT on and everything maxed out. I play with DLAA and FrameGen and the game looks beautiful.

Kuraiser

I could feel it right away; something felt off with mouse movement. Thanks for this breakdown. It's great to know the base framerate affects the post-generation latency; that gives me more to work with in getting it down. Excellent video.

jakemee

I'm late to the party here, but this is a great explanation of the whole DLSS 3 fiasco. You gave a calm explanation to counter some arguments, were unbiased about it, and even presented some of your evidence.

I will say, however, that I shared a system with a 7900 XTX with my brother, but decided to build my own as well, with a 4080 Super that finally arrived yesterday. My last NVIDIA GPUs were in all-in-one PCs, one the family's and one personally mine: the family's with a GTX 1050 mobile and mine with a 930MX. Neither was playable on anything, even more so the 930MX.

Looking at both drivers' settings software reminds me of the differences I've seen between these technologies. I would say, and many others more informed on this would say, that Anti-Lag is more of a competitor to NVIDIA's Low Latency Mode in the control panel. They both work by dynamically limiting the number of frames the CPU can produce to help knock out the render queue, thus keeping less work queued up ahead of the GPU.

NVIDIA Reflex and Anti-Lag+ (whenever that comes back) take this a step further by timing this better with added game data and by also timing things within the game's engine.

technologicalelite
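
A hedged sketch of the general "just-in-time" idea the comment above attributes to in-engine latency modes. This is illustrative only, not the actual Reflex or Anti-Lag+ SDK: instead of letting the CPU run ahead and queue frames, the engine delays the start of each CPU frame so input is sampled as late, and therefore as fresh, as possible.

```python
# Hedged sketch of the "just-in-time" idea behind in-engine latency modes
# (illustrative only, not the actual Reflex or Anti-Lag+ SDK).
import time

def sample_input():           # placeholders standing in for real engine hooks
    pass

def simulate_and_record():
    pass

def submit_to_gpu():
    pass

def run_cpu_frame(gpu_busy_remaining_s, cpu_work_s):
    # Wait until the GPU is almost free rather than queueing another frame now.
    time.sleep(max(0.0, gpu_busy_remaining_s - cpu_work_s))
    sample_input()            # input sampled right before it is needed
    simulate_and_record()     # CPU-side work, roughly cpu_work_s long
    submit_to_gpu()           # GPU picks the frame up with little to no queue

run_cpu_frame(gpu_busy_remaining_s=0.012, cpu_work_s=0.004)
```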

Slight correction... we actually don't know if DLSS is AI-based. This is what Nvidia has said it is, but since it's closed source, and since Nvidia's marketing works a lot with half-truths, I wouldn't take it at face value. First of all, we don't do AI, nobody does AI; what we do is machine learning, and in that case the algorithm needs to learn over time, which it doesn't. There was a huge paradigm shift from DLSS 1 (which was AI-generated), but the results were REALLY bad and the tech quickly became a laughing stock. Nvidia went back to the drawing board and kept the name "Deep Learning Super Sampling", but ultimately DLSS 2 is a temporal upscaler. It does use tensor cores for acceleration, which, sure, Nvidia markets as "AI cores", but in reality these cores are specifically designed for matrix calculations; that's not the same thing as "AI". They allow RTX GPUs to accelerate parallelized workloads, which also accelerates machine learning, but to get blistering-fast performance it calculates with several layers of temporal data to reconstruct each pixel on screen. There is no image recognition, there's no learning, there's no AI. It's just an advanced, but accelerated, temporal upscaler.

PixelShade
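
As a toy illustration of the "temporal accumulation" idea described above: the snippet below blends each new low-resolution frame into a higher-resolution history buffer. It is greatly simplified; real TAA/DLSS-style reconstruction also uses motion vectors, sub-pixel jitter and history rejection, and the blend factor here is an arbitrary assumption.

```python
# Toy illustration of temporal accumulation (greatly simplified; real
# TAA/DLSS-style reconstruction also uses motion vectors, sub-pixel jitter and
# history rejection, and the blend factor is an arbitrary assumption).
import numpy as np

def temporal_accumulate(history_hi, current_lo, alpha=0.1):
    """Blend an upscaled low-res frame into a high-res history buffer."""
    h, w = history_hi.shape
    # Naive nearest-neighbour upscale of the newly rendered low-res frame.
    ys = np.arange(h) * current_lo.shape[0] // h
    xs = np.arange(w) * current_lo.shape[1] // w
    upscaled = current_lo[ys[:, None], xs[None, :]]
    # Exponential moving average: detail builds up in the history over time.
    return (1.0 - alpha) * history_hi + alpha * upscaled

history = np.zeros((8, 8))
for _ in range(10):
    low_res_frame = np.random.rand(4, 4)  # stand-in for a new low-res render
    history = temporal_accumulate(history, low_res_frame)
print(history.shape)
```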

The people with a problem with frame gen probably either don't understand how it works, are trying to run it on underpowered hardware, or can't afford a graphics card capable of running it and are hating on it because some other YouTubers told them it was bad.

chincemagnet

Frame gen is awesome. You probably need a 4090 and a 120Hz or higher display to really take full advantage of it. If you can't make it work correctly, you're probably lacking common sense or decent hardware. It's a massive game changer in Jedi Survivor, A Plague Tale: Requiem, and The Witcher 3. What I have found is that it's best used in conjunction with DLSS and, if you have the overhead, DSR. Dead Space Remake doesn't have frame gen, but the ultimate way to play that game is with a 4090 using DSR for 7K resolution with DLSS Quality. It looks way better than native. With frame gen, you could push it even harder or just make it smoother.

chincemagnet
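
Some quick arithmetic behind the DSR + DLSS combination mentioned above. The scale factors are assumptions for illustration: DSR factors are treated as per-area multipliers, and DLSS Quality is treated as rendering at roughly two-thirds of the output resolution per axis.

```python
# Quick arithmetic for combining DSR with DLSS Quality. Assumed scale factors:
# DSR factors are per-area, and DLSS Quality renders at roughly 2/3 of the
# output resolution per axis.

def dsr_plus_dlss(display_w, display_h, dsr_factor, dlss_axis_scale=2 / 3):
    axis = dsr_factor ** 0.5
    dsr_res = (round(display_w * axis), round(display_h * axis))
    render_res = (round(dsr_res[0] * dlss_axis_scale),
                  round(dsr_res[1] * dlss_axis_scale))
    return dsr_res, render_res

dsr_res, render_res = dsr_plus_dlss(3840, 2160, dsr_factor=3.0)  # 3x DSR on 4K
print("DSR output:", dsr_res, "| DLSS Quality internal render:", render_res)
```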

As someone who severely despises input latency, and would at all times prefer less than 20ms, if you can't feel a 10ms difference in input latency then it's definitely for you. If you can, like me, then it's snake oil. Adding input latency is absurd. I don't care how the game "looks"; I care how it feels.

pat

Most importantly, DLSS 3 is now open source, so you can have DLSS 3 in games regardless of whether the developer includes support or not. Starfield with DLSS 3 is going to be amazing.

AngelTorres

It's really about what you are able to perceive. I can absolutely feel the input latency in Cyberpunk with FrameGen enabled; some may not. Also, tuning graphics settings will help reduce ghosting and artifacting. Cyberpunk is a complete blur fest with DLSS 3 when higher RT settings are enabled, especially path tracing, but it looks and feels great with DLSS 2 at the same settings. A frame limiter and Reflex are a must with FrameGen. (Cyberpunk player and OLED + 4090 user here)

jking
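
On the "frame limiter and Reflex are a must" point above: the cap that Reflex-style limiters are commonly reported to apply under G-SYNC + VSync sits a few fps below the refresh rate. The formula below is a community approximation, treated here as an assumption rather than an official NVIDIA specification.

```python
# Community approximation (an assumption, not an official NVIDIA formula) for
# the frame cap Reflex-style limiters are commonly reported to apply with
# G-SYNC + VSync: roughly refresh - refresh^2 / 3600.

def approx_low_latency_cap(refresh_hz):
    return refresh_hz - (refresh_hz ** 2) / 3600.0

for hz in (120, 144, 240):
    print(f"{hz} Hz panel -> cap around {approx_low_latency_cap(hz):.0f} fps")
```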

The problem with frame generation is that the players who need it can't use it, because it requires a high framerate in the first place. Also, I'm not buying a 600€+ GPU just to have to use upscaling and frame generation; those tricks should be for 200€ GPUs.

iamspencerx

Definitely. Most people find FG a poor experience because they tried to use this tech like magic: "OK, I've got 20 fps in Cyberpunk, let's use FG! Hmm, I've got 60 fps but it feels poor!" I say to people like this: it's not magic, it's tech, and it needs to be used right. You need to tune the in-game settings until you see 40-50 fps on your monitor before you turn on FG, to feel fine with the input latency. I find this tech fantastic! I can play Alan Wake 2 with path tracing at 70-80 fps. Paradoxically, if you use your brain (and understand how input latency works), you really do see magic. And if we go back to 2019, people hated the first version of DLSS too, but here we are: DLSS works perfectly in 2023 and doesn't have many haters. P.S. Sorry for my broken language (I'm really far from speaking and writing natively); I hope you understand something of what I wrote.

Johan-jhbl

Hey there, great video 🙂 I have a question: do you know how to fix screen tearing while using frame gen? I tend to enable VSync in NVCP, but that adds input lag, which we don't want. It's just surprising that I see tearing when I'm on a G-SYNC compatible panel. 🤔

Chasm

How do you get your overlay from GeForce Experience to show? I use it as well, but it never shows up in my recordings.

Reaper