You mentioned Alan Wake and Cyberpunk - frame generation is good for slower-paced adventure games because you don't need fast responses. The faster the movement in a game, the worse frame generation is for it. Darktide is very fast-paced movement-wise, not a slower shooter like Helldivers 2.
If you want the absolute best input latency reduction, do the following.
This may sound strange, but just do it for a couple of games, then play the Zealot class. You will see the advantage it puts you in. Treat it purely as a test.
VSync off at the Nvidia driver level and VSync off in game
Don't use G-Sync
DLSS Performance mode. Ultra Performance is too blurry.
Set no FPS cap. Whatever your monitor's refresh rate, there's an input latency benefit to going over it.
Disable frame generation
Enable Reflex.
Texture quality doesn't affect FPS; it only determines VRAM usage. Set it to High.
Set all other graphics settings to lowest
Now you've got the best input latency reduction. You'll get screen tearing without VSync/G-Sync, but with everything at lowest you've lessened the FPS drops and pulled the average up.
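For reference, here's the whole checklist collected in one place. This is just a plain Python dict summarizing the settings above; the key names are made up for illustration and don't correspond to any real driver or Darktide config format.

```python
# Summary of the latency-focused settings above.
# Key names are illustrative only -- not a real Nvidia driver
# or Darktide configuration format.
low_latency_settings = {
    "driver_vsync": "off",       # VSync off at the Nvidia driver level...
    "ingame_vsync": "off",       # ...and off in game
    "gsync": "off",              # don't use G-Sync
    "dlss_mode": "performance",  # Ultra Performance is too blurry
    "fps_cap": None,             # no cap, even above the refresh rate
    "frame_generation": "off",
    "reflex": "on",
    "texture_quality": "high",   # costs VRAM only, not FPS
    "other_graphics": "lowest",
}
```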
Ignore any software overlay metrics you're using. Just judge how it "feels" yourself, from mouse movement to on-screen response. Try it with the Zealot class.
The problem with software overlay metrics is that they'll report something like 30 ms latency going to 60 ms. People shrug because "humans can't detect an extra 30 ms", but they're reading it as an absolute number when it's actually double the latency.
This is why hardcore people obsess over RAM timings, trying to go from 80 ns to 60 ns latency. It isn't about the absolute number, it's the fractional difference.
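To make the fractional-difference point concrete, here's the trivial arithmetic in Python, using the exact numbers above (30 ms vs 60 ms overlay latency, 80 ns vs 60 ns RAM latency):

```python
def relative_change(before: float, after: float) -> float:
    """Fractional change between two latency readings."""
    return (after - before) / before

# Overlay shows latency going from 30 ms to 60 ms: "only" 30 ms more,
# but that's +100% -- double the latency.
print(f"{relative_change(30, 60):+.0%}")  # +100%

# RAM tuning from 80 ns to 60 ns: -25%, which is why tuners chase
# what looks like a tiny absolute number.
print(f"{relative_change(80, 60):+.0%}")  # -25%
```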
The balance is your choice. The above are extreme settings, as if you're playing competitively. But try it once or twice and you'll see what I mean about the advantage it gives you. Then find a middle ground between acceptable visuals and acceptable FPS/input latency.
I completely understand what you mean! I don't just go by the numbers either; I've tested it extensively. I use the PC latency (PCL, visible via Nvidia's FrameView software) as a guideline for myself. I find the lag tolerable up to approx. 30-35 ms. In Alan Wake, a slow single-player game, 40-50 ms is also fine, so when I'm in these ranges it feels OK to me. But it feels even better when you also run at 240 Hz! Other testers have found the same. So it's a combination of the FPS level with upscaling and the Hz of the display. As I said, a PCL of 25 feels worse at 100 FPS than at 240 FPS on a 240 Hz display.

That's exactly the point. Lower FPS automatically increases the PCL, precisely because each frame takes longer. So with higher FPS you automatically have better latency. This means that if you already have high FPS from native rendering plus upscaling, the latency is also good. If you then use FG, the latency increases only minimally, but you get the full Hz and therefore a very fluid picture.

It's hard to explain, but I've tested it: Darktide now plays twice as well with FG as without! Without it, action-packed fights sometimes knock the FPS down, which increases the latency. Then you have 90 FPS, i.e. 90 Hz, which feels slower than 144 Hz or 240 Hz at a latency of, say, 30 ms, simply because of the lower FPS. With FG, the 240 Hz makes the picture faster while the latency only rises from, e.g., 30 ms at 90 FPS to 35 ms at 1XX FPS, and with FG 4x to 36 ms at 3XX FPS! This is exactly my point and I have tested it extensively. So if the base FPS (native + upscaling) is already high, FG only increases the PCL very slightly, and I feel the extra Hz absorbs that increase very well.
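To make the "base FPS dominates PCL" idea concrete, here's a toy model in Python. The formula (a fixed pipeline cost plus one rendered-frame time, with FG adding a small fixed overhead) and all of its constants are my own illustrative assumptions, not FrameView's actual measurement:

```python
def toy_pcl(render_fps: float, pipeline_ms: float = 15.0,
            fg_overhead_ms: float = 0.0) -> float:
    """Toy latency model: fixed pipeline cost + one rendered frame time
    (+ FG overhead). All constants are illustrative assumptions."""
    frame_time_ms = 1000.0 / render_fps  # FG doesn't shrink this
    return pipeline_ms + frame_time_ms + fg_overhead_ms

print(f"{toy_pcl(90):.1f} ms")                      # 90 FPS, no FG: ~26.1 ms
print(f"{toy_pcl(90, fg_overhead_ms=5.0):.1f} ms")  # 90 FPS + FG:   ~31.1 ms
print(f"{toy_pcl(60):.1f} ms")                      # 60 FPS, no FG: ~31.7 ms
```

In this toy model, 90 rendered FPS with FG still has lower latency than 60 rendered FPS without it, while displaying far more frames per second, which is the effect described above.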
It's all hard to explain, but I've been actively gaming for many years, so I know what you mean. Still, the right combination of hardware and settings can work really well with FG! I wouldn't have thought so either. FG on my old 144 Hz 21:9 Full HD monitor with a 4070 Ti was unthinkable, but now it really runs best like this! That's why I wanted to test 4x.
Not inaccurate. But I did investigate further: without switching to DLSS4 my game never crashed, and with it, it did, but only because of my OC. I'm guessing DLSS4 is sensitive to overclocks, causing the DXGI "Device Hung" error.
It's unfortunate, but I had to drop back to stock clocks to be able to use DLSS4. Fortunately I can create individual profiles per game, so I made a separate one and can keep the OC for other games.
Huh, good to know, thanks for the update.
Are you OCing the memory too? In the past I had certain titles (RT especially) that were buggier when I did, but were fine with just the core clock boosted.
I ended up just undervolting to keep things cool and quiet w/ some occasional OC profiles I never used.
How well an OC holds up varies a lot by game, with RT as a major factor.
For example, in raster benchmarking you'll find your best OC, but then at that same OC you won't make it through a benchmark that involves RT.
3DMark:
Steel Nomad - Raster
Port Royal - RT
Out of the 3DMark suite, I'd say Port Royal is the best stability test, even if you don't use RT, because more and more games force RT on, so at best you can only set RT to low in the graphics settings. The Indiana Jones game is an example of this.
So I bet that with RT turned off in Darktide you'll get a more stable and higher OC.
Then you've got the Monster Hunter games - very sensitive to OC, to the point that you might have to dial back to 1/4 or 1/3 of your memory OC (see the sketch below).
There's a free benchmark download for the latest Monster Hunter game on Steam, and you'll see what I mean.
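The dial-back process is easy to describe as a loop. A minimal sketch, assuming you wrap your manual "apply offset, run benchmark" step in a callable; `find_stable_mem_oc` and `is_stable` are hypothetical names, and nothing here talks to a real OC tool:

```python
from typing import Callable

def find_stable_mem_oc(raster_stable_offset: int,
                       is_stable: Callable[[int], bool],
                       step: int = 100) -> int:
    """Walk the memory OC offset (MHz) down from the raster-stable value
    until the worst-case test passes. Per the post above, expect to land
    at a fraction (sometimes 1/3 to 1/4) of the raster-stable offset."""
    offset = raster_stable_offset
    while offset > 0 and not is_stable(offset):
        offset -= step  # dial back and retest
    return offset

# Toy demo: pretend anything above +400 MHz fails the RT benchmark.
# In practice `is_stable` is you applying the offset (e.g. in Afterburner)
# and running Port Royal or the Monster Hunter benchmark to completion.
print(find_stable_mem_oc(1200, lambda mhz: mhz <= 400))  # -> 400
```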
Nvidia GPUs are tuned so aggressively that it's like they're overclocked at the factory. That's why most cards only gain ~5% from an OC but with a disproportionate increase in power usage, so most people undervolt instead, running at 70-80% power and losing only 5-10% performance.
There are scenarios where you can achieve overclocks nearer to 12-15%, almost reaching the next tier of GPU, but at power usage beyond that next-tier GPU.
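The efficiency math behind that is easy to check. Perf and power figures here are the rough percentages from this post, normalized to stock = 1.0; the OC power numbers are assumed for illustration:

```python
# (relative performance, relative power), stock = 1.0.
# OC power figures are assumptions; perf figures follow the post above.
profiles = {
    "stock":      (1.00, 1.00),
    "typical OC": (1.05, 1.30),  # ~5% perf for a big power jump (assumed +30%)
    "undervolt":  (0.92, 0.75),  # lose 5-10% perf at 70-80% power
    "extreme OC": (1.13, 1.60),  # 12-15% perf, power beyond next tier (assumed)
}

for name, (perf, power) in profiles.items():
    print(f"{name:>10}: perf {perf:.2f}  power {power:.2f}  perf/W {perf / power:.2f}")
```

Perf-per-watt comes out around 1.23 for the undervolt versus 0.81 for the typical OC, which is why most people undervolt.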
This is not an FS issue; the most they can do is update the third-party DLLs periodically with game updates, which some games do occasionally, though technically this also requires a QA pass, even though the DLLs have been drop-in upgrades for a while. Odds are there will be another big official update adding multi-frame generation (MFG), as showcased at the nV live reveal and on their site; that would update all the DLSS and Streamline files.
It IS, however, nVidia's responsibility to preserve these settings whenever possible, barring clean installs (and DDU). It looks like a bug/teething problem, unless there's a deeper underlying technical reason the setting can't be preserved.
The whole point of this new nV App + driver feature is to override and use the latest DLLs at the "official" nV level, without the game dev being involved or using other external tools like DLSS Swapper.
It's even possible the overriding versions will be more recent than what any already-shipped game includes, as was the case when this feature launched.
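If you want to verify which DLSS DLL a game actually shipped versus what the override provides, here's a minimal sketch for Windows; the game path is a placeholder and `pywin32` is assumed to be installed (`nvngx_dlss.dll` is the standard DLSS DLL name):

```python
import os
import win32api  # pywin32, assumed installed

def dll_version(path: str) -> str:
    """Read the file version resource of a DLL (e.g. nvngx_dlss.dll)."""
    info = win32api.GetFileVersionInfo(path, "\\")
    ms, ls = info["FileVersionMS"], info["FileVersionLS"]
    return f"{ms >> 16}.{ms & 0xFFFF}.{ls >> 16}.{ls & 0xFFFF}"

# Placeholder path -- point this at your own install's binaries folder.
game_dir = r"C:\Games\Darktide\binaries"
dlss_dll = os.path.join(game_dir, "nvngx_dlss.dll")
if os.path.exists(dlss_dll):
    print(f"Shipped DLSS DLL: {dll_version(dlss_dll)}")
```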
I have tested every known trick out there to force 4x MFG in Darktide, and it doesn't seem to work; it always runs at 2x.
@FatsharkJulia Can you confirm whether MFG support is planned? Even after disabling the NvApp whitelisting and forcing it, it doesn't trigger for Darktide.
That means something else is either too old or flat-out doesn't support it.
Hi Julia, I force-enabled multi-frame generation (MFG) in Darktide and confirmed through testing that it works, demonstrating that MFG can indeed be activated in the game. See my newly published post on the forum. Hopefully MFG will be added to the game's built-in options soon and get official optimization from the developers.