Everything in the game runs fine. My graphics card (overclocked GTX 980 Ti @ 1500 MHz; equivalent to a GTX 1070 AIB OC) doesn't even sweat running the game at the custom settings I use. I like to play with vsync on my 60 Hz monitor; GPU usage hovers between 40% and 70% at a locked 60 fps. I can maintain that with almost all characters except Sienna with a beam staff. Playing Sienna with the beam causes frequent drops below 60 when I blast through hordes (they light up). Even with particle effects set to "low", blasting through a crowd of enemies makes GPU usage jump from a mere ~50% to 99% instantly (indicating that the graphics card becomes the bottleneck during those fps drops). I've had this problem ever since release. Despite patch 1.2 mentioning GPU optimization of particle effects, beam blasts lighting up hordes of enemies still cause big fps dips below 60. A heavily OCed 980 Ti not being able to maintain at least 60 fps while blasting on "low" particle effect settings is really frustrating 6 months after release. We appreciate the various CPU optimizations in the game, but please also look into GPU optimization, especially related to beam blasts lighting up heavy hordes.
My game performance has definitely tanked after this DLC. In fact, if I'm at a ledge meleeing stuff to death and there's a Sienna beam-staffing the horde as it comes up, I am literally blinded by the amount of light particle effects going off.
We're in the process of looking into the performance drops, particularly with Sienna. Thank you for your report.
Isn't dynamic lighting handled by the CPU? Shadows from it and so on? At least that's what it says in the GFX menu. I have a really good GPU, but an older CPU. I've had to turn down lighting effects or my CPU bottlenecks.
The CPU has a part in any graphics or 3D rendering. But nothing is handled by the CPU or GPU alone; it's not an either-or thing. They always work together. For any graphics, the CPU sends draw-call commands to the GPU, and the GPU draws and renders things. Each has its own respective part of the work in frame generation. Much of their workload is parallelized, but they do not necessarily finish at the same time. Often, one has to wait for the other to finish its part before the frame is complete. So there is always a bottleneck in the pipeline: either the CPU or the GPU.
Here is the rough understanding that I have. The time it takes for a frame to be completed is called frame time. It's usually close or equal to the time spent by the last component to complete its work.
Scenario 1:
CPU time = 8 ms
GPU time = 16 ms
Frame time = 16 ms
The GPU took longer to finish its work, therefore the GPU is the bottleneck for this frame.
Scenario 2:
CPU time = 17 ms
GPU time = 5 ms
Frame time = 17 ms
The CPU took longer to finish its work, therefore the CPU is the bottleneck for this frame.
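The two scenarios above can be sketched in a few lines of code. This is a minimal model, not how the engine actually measures things: it just assumes frame time equals the slower of the two components, using the millisecond figures from the scenarios.

```python
# Minimal sketch of the frame-time model above: each frame takes roughly
# max(CPU time, GPU time), and the slower component is the bottleneck.

def bottleneck(cpu_ms: float, gpu_ms: float) -> tuple[str, float]:
    """Return which component limits the frame, and the resulting frame time."""
    frame_ms = max(cpu_ms, gpu_ms)
    limiter = "GPU" if gpu_ms >= cpu_ms else "CPU"
    return limiter, frame_ms

# Scenario 1: GPU-bound
print(bottleneck(8, 16))   # ('GPU', 16)
# Scenario 2: CPU-bound
print(bottleneck(17, 5))   # ('CPU', 17)
```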
Depending on what system you have, you could have either a CPU or a GPU bottleneck. What is a CPU bottleneck on your system could be a GPU bottleneck on mine. But devs always target a criterion: ideally, they want their games to be GPU-bottlenecked. That target is quite hard to maintain on many mainstream systems in this game, because it is generally much more CPU-dependent than other linear games (with all the horde draw calls, networking, AI pathing, and game complexity). That's why much of the optimization effort in the past few patches has been focused on reducing CPU dependency. And they have done quite a good job at that so far.
On the consumer end, we can use tools like RTSS/MSI Afterburner to detect where the bottleneck is. The key is to look directly at GPU usage (not CPU usage, because the CPU runs complex logic unlike the GPU, so CPU usage does not represent resource availability or work capacity as linearly as GPU usage does).
The rule of thumb is: if your GPU is maxed out in its usage, then your graphics card is the bottleneck. If it isn't maxed out, then the bottleneck lies elsewhere (i.e., in the CPU/RAM department). I pointed out in my original post that when I observe a poor frame rate while beam-blasting over big hordes, my GPU usage gets maxed out at 99-100%. Meaning that my GPU is pushed to its limit there, not the CPU/RAM. Had I had a 1080 Ti, I wouldn't have had this drop, of course. But that's not the point. The point is that maxing out a 1070/1070 Ti-level graphics card at 1080p on "low" particle quality settings, BELOW 60 fps, is not OK. They need to look at the GPU side of handling and optimizing the particle effects associated with Sienna's beam blasting and horde lighting as well.
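The rule of thumb above can be written down as a tiny helper. The 95% cutoff is my own assumption for "maxed out", not a number from the overlay tools; the 99% and ~50% sample readings are the ones quoted in this thread.

```python
# Hedged sketch of the rule of thumb: GPU usage pinned near 100% while
# fps drops means the GPU is the limiter; otherwise look at the CPU/RAM
# side. The 95% threshold is an assumption, not a tool default.

def likely_bottleneck(gpu_usage_pct: float, threshold: float = 95.0) -> str:
    """Guess the limiting component from an RTSS/Afterburner GPU usage reading."""
    return "GPU" if gpu_usage_pct >= threshold else "CPU/RAM side"

print(likely_bottleneck(99))   # GPU (the beam-blasting case from the post)
print(likely_bottleneck(50))   # CPU/RAM side (normal gameplay reading)
```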
I was just pointing out that for many of the lighting effects in the GFX menu, when you hover your mouse over them, it says they are CPU High and GPU Low-Medium.
I don't have the best system (GTX 1060, 32 GB RAM, but an older i7 CPU). I noticed that lowering the CPU-dependent settings improved the FPS for me, especially during hordes xD
The game might also just not be optimized that well. My GPU sits around 50-75% usage most of the time, but can spike. My CPU sits at 24% or so and spikes during hordes, obviously. I played a game a few months ago, Dreadnought, and it was optimized extremely badly: just sitting in the hangar (think the keep in this game) would cause the GPU and CPU to hit max usage, 99% constantly; meanwhile, in an actual match it would only use 20-30%. Anyway, good luck.
Those GFX menu tooltips are all relative. Shadows take a lot more draw calls than, say, volumetric fog or alpha effects. That's why shadows would be considered CPU-heavy and alpha effects GPU-heavy. But that does not mean the GPU has no part in CPU-heavy graphics, or vice versa.
I too used to have this issue running a GTX 780 Ti with an i7 960 and 16 GB RAM.
Turned off vsync and it made a big difference.
Sucks cause I hate screen tearing… so I bought a new rig (GTX 1080, i5-8400) with a slick 165 Hz G-Sync monitor. Playing at >60 Hz is so smooth I can't believe it's not butter.
The problem mentioned has nothing to do with vsync, though. It persists with or without vsync. With vsync off, I would get 140+ fps without a problem UNTIL I start beam-blasting through an oncoming wave of enemies, and it drops to 53 fps… ugh. And, as mentioned earlier, GPU usage is always 99%. I'm talking about a few seconds of fps drop; 99.99% of the time I'm upwards of 120-160 fps in the game. But those few seconds are a crucial moment of enemy engagement. The only solution seems to be not playing Sienna at all.
While you're at it, can you do something about Sienna's staves blinding you at all times and actively contributing to the game's built-in bad "feature", the iris camera effect? I'd really like a setting to disable the fire glow when not in 3rd person.
Turn off bloom in GFX settings, that fixes it for me, along with lens flare.
I spent 10 real-time hours in-game over the last 2 days testing literally every lighting feature this game has. Literally nothing can disable the iris effect in Vermintide 2. If you scan the Reddit and FatShark forum threads, multiple people can attest to this fact as well; it's hard-coded into the game.
Can you give me a SS of this? Cause I don't notice it at all. After I turned down/off some settings, I don't see it anymore. That's if you're talking about the annoying "light circles" and dots that appear all over your screen from fire and other effects. Even the red weapons' glow would cause it on my screen before.
Common examples are when you walk in and out of caves, or when you light a grenade or use a fully charged staff explosion, etc. The game is hard-coded to make bright light sources bloom, which diminishes colors and image fidelity outside of that space (if only temporarily). However, these moments of light change detract from the game and actively sabotage play potential.
If you turn literally all light settings off/disabled/Low, you will still see the blooming on light changes, just at a lower intensity; there is definitely no way to turn it off. This is also annoying because I am forced to turn all those settings to off/Low, which means the game looks much worse as a result.
The point about staves in general being too bright in the static display is a real problem for how human eyes work. It's like playing a video game with someone holding a flashlight in your eyes from the side at all times. It gets pretty annoying after a while and makes your eyes work harder in general, which can cause fatigue/headaches.
This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.