Issue Description:
My minimum FPS has been about 30, my maximum hovers between 55 and 65. This seems rather low for my hardware; I can run most other games on Ultra quality around 100FPS, and on medium/high at a capped 120.
Attempted Solutions:
I’ve altered my video settings up and down; ‘Low’ makes things better, but at ‘Medium’ the game struggles in areas with fire, smoke, heat waves, and other VFX. Drivers are the latest nVidia GeForce release.
EDIT: I’d like to note that while my console log shows me using an Intel Iris GPU, my laptop has dynamic GPU switching and is most assuredly using the 3080.
Platform:
Steam
PC Specifications:
MSI Vector GP76
nVidia RTX 3080 (Laptop) 8GB
Intel Core i7-12700H
32GB DDR4 SDRAM
App installed on a Samsung 980 PCIe SSD
I have exactly the same problem!
The performance was already bad in the closed beta but now the game is unplayable!
When I rotate my view, the FPS drops from 144 FPS (capped via the Nvidia driver) to as low as 12 FPS!
The stuttering is so bad that I get dizzy while playing and can’t run up stairs or hit opponents.
So the game is unplayable for me, even on low settings without ray tracing, bloom effects, etc.!
Driver updated today to version 526.98
Nvidia GeForce RTX 3080
Playing at 3440x1440 resolution
AMD Ryzen 9 5900X 12-Core Processor 3.70 GHz
32GB DDR4 SDRAM
Windows 10 Home, version 22H2
Game and system are installed on an MP600 PCIe NVMe M.2 SSD
You might want to disable your integrated graphics in order to make sure that the game is using the nvidia gpu, I had to do this for some games when I had a laptop. Also, keep in mind that a laptop rtx 3080 is nowhere near as fast as a desktop rtx 3080.
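If you want to double-check which adapters Windows actually exposes before disabling anything, a quick sketch from an elevated Command Prompt or PowerShell (assumes a standard Windows 10/11 install; `wmic` is deprecated but still present on most systems, and the PowerShell variant is the modern equivalent):

```shell
:: List every video adapter Windows knows about (deprecated but widely available)
wmic path win32_VideoController get Name

:: PowerShell equivalent on newer systems where wmic has been removed
powershell -Command "Get-CimInstance Win32_VideoController | Select-Object Name, DriverVersion"
```

If both the Intel Iris and the RTX 3080 show up, you can also force the game onto the discrete GPU per-app via Windows Settings → System → Display → Graphics, or the NVIDIA Control Panel’s program settings, rather than disabling the iGPU in Device Manager outright.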
I’m aware of this. The integrated chipset is not being used. Also, I’m aware of notebook GPU performance. I’m still able to run other titles at much higher framerates at much higher settings; this poor of performance is very unusual for my hardware.
I’m also experiencing quite bad performance on my 3090. Ironically the game actually seems to perform better on older gen Nvidia cards. I have a rig with a 1080 that gets pretty solid framerates on medium.
I think something is just really messed up between 3000-series (and newer) cards and this game.
I also specifically upgraded before Darktide (because I was super hyped), from a 1070 Ti to a 3080 (MSI Gaming Z Trio RTX 3080), and the performance vs. quality I can achieve is very questionable.
With ray tracing on medium, DLSS on auto, some GPU-heavy features (like volumetric lighting) deactivated or lowered, and a generally “medium” overall config, I can barely reach 60 FPS. And as soon as there is fighting with hordes going on, it’s unplayable.
Not a Notebook GPU, no onboard or CPU graphics enabled/available…
Can it be that a lower tier CPU (i5-10400F) is the bottleneck?
Can confirm a 3080/5600X also doesn’t perform great: medium settings with no RTX and all the extras turned off just to keep above 60 (most of the time). Tried DLSS on most settings, also FSR 2.0, but nothing amazing in terms of results.
An i5-6600 is specified as the minimum requirement in the list.
Here is a comparison between the CPU you mentioned and the CPU from the minimum requirements:
So in my opinion, your CPU slightly exceeds the minimum requirement and should be able to handle the game depending on your used settings, resolution, background services etc.
However, it is not optimal in terms of the overall balance of your system.
The best thing to do is to look at your CPU load during gameplay using appropriate monitoring tools (e.g. Task Manager’s per-core view).
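For logging CPU load while the game runs, one built-in Windows option is `typeperf`; a minimal sketch (the counter paths and flags are standard Windows performance counters, the filename is just an example):

```shell
:: Sample total and per-core CPU usage once per second for 60 seconds,
:: writing the results to a CSV you can inspect afterwards.
typeperf "\Processor(_Total)\% Processor Time" "\Processor(*)\% Processor Time" -si 1 -sc 60 -f CSV -o cpu_load.csv
```

If one or two cores sit pinned near 100% during horde fights while the GPU load drops, that points to a CPU (or main-thread) bottleneck rather than the graphics card.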
Pretty unreal that I can have a $2,400 PC with a 3080, can get 120fps WITH RAY TRACING in Control on a 1440p ultrawide, but Darktide can barely push 45fps with medium settings and RT on… How poorly optimized could this be? Isn’t DLSS supposed to fix this? Cyberpunk DLSS clearly and accurately scales the rendering so that I can get 90fps consistently with RT on and all settings maxed. How can Darktide run on lower settings with ~half the performance? There’s an optimization issue here.
Turning off RT entirely bumps the fps up to ~90 with dips in the 60s as @SosoDeSamurai mentioned. But why on earth did I spend this kind of money on my hardware if the game is virtually unplayable on it?
Edit: for even more baffling context, I run Elden Ring in ultrawide, fully maxed settings, with ReShade ADDING post processed ray tracing, and even there I get 100fps consistently. In a massive open-world game with extremely far draw distances and incredible ambient physics, I get considerably better performance. Elden Ring doesn’t even have DLSS…
OK, here is a first small workaround that improves the problem a bit!
It helped me to completely reinstall the driver (ticking the clean-install checkbox during installation), and after the reinstallation I did not re-enable Nvidia’s ShadowPlay feature.
This seems to help at least insofar as the frame rate no longer drops below 30 as often or for as long, and the game runs roughly like it did in the closed beta again.
Can you possibly reproduce this as well?