... And I thought not being able to make bridge-to-rail in old Quake because your FPS was too low was bad... now your guns do less damage because your FPS is too low...
Edit: I also wanted to confirm whether the slower fire rates were directly tied to your ability to do damage, just in case there was some funny prediction business on the server end mitigating this. As expected, there isn't, and your damage is in fact directly affected.
So turbotortoise43 went to the trouble of doing this in a more professional manner which illustrates the same problem:
Update: I've removed the previous results and cleaned up my testing method big time compared to what you can see in the clip... and the results are shocking as hell. I thought we were pretty much safe playing in the 110-144 FPS region, but this is not the case.
So now, as opposed to just setting out to prove that frame rate and fire rate are correlated, I set out to determine exactly what fire rates we get at the most commonly used FPS caps, and what frame rates we actually need to see the fire rates we expect.
So I created a macro with my Roccat software that simply holds the fire key down for 4000 ms. I push the button while limiting the game to different frame rates and count how many shots are fired in those 4000 ms, then repeat 5-7 times, because there is a margin of error here of between 0 and 99 ms over the 4000 ms (which I see in QL as well). That results in a difference of 1 shot every now and then over the 4 seconds, so I think it's slight imprecision in the Roccat software.
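To put that margin of error in perspective: with up to 99 ms of slack in the macro's hold time, the measured rate can drift by roughly one shot over the window. A minimal sketch of the bounds (the 4000 ms and 99 ms figures are from the test setup above; the function name is mine):

```python
def rate_bounds(shots, window_ms=4000.0, jitter_ms=99.0):
    """Shots-per-second bounds when the true hold time is somewhere
    between window_ms and window_ms + jitter_ms."""
    hi = shots / (window_ms / 1000.0)                 # macro held exactly 4000 ms
    lo = shots / ((window_ms + jitter_ms) / 1000.0)   # macro overshot by the full 99 ms
    return lo, hi

lo, hi = rate_bounds(80)
print(f"80 shots -> between {lo:.2f} and {hi:.2f} per second")
```

So an 80-shot count pins the true rate to a band about half a shot per second wide, which is why repeating the run 5-7 times and watching for the occasional ±1 shot is enough to trust the averages.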
As a control test, in QL this test yields the same fire rates for LG regardless of whether the game is running at 30 FPS or 300 FPS.
Anyway, the results are bonkers crazy, because even at 144 FPS we're not getting our expected fire rates. If you want your expected fire rates, you need to be up in the 200+ region.
Shots fired in 4000 ms:

200 FPS: 80 cells, 20 per second [140 DPS]
144 FPS: 72 cells, 18 per second [126 DPS]
120 FPS: 69 cells, 17.25 per second [120.75 DPS]
60 FPS: 61 cells, 15.25 per second [106.75 DPS]

200 FPS: 40 rounds, 10 per second [80 DPS]
144 FPS: 37 rounds, 9.25 per second [74 DPS]
120 FPS: 37 rounds, 9.25 per second [74 DPS]
60 FPS: 35 rounds, 8.75 per second [70 DPS]

200 FPS: 40 rounds, 10 per second [100 DPS]
144 FPS: 37 rounds, 9.25 per second [92.5 DPS]
120 FPS: 37 rounds, 9.25 per second [92.5 DPS]
60 FPS: 35 rounds, 8.75 per second [87.5 DPS]

200 FPS: 20 rounds, 5 per second [75 DPS]
144 FPS: 19 rounds, 4.75 per second [71.25 DPS]
120 FPS: 20 rounds, 5 per second [75 DPS]
60 FPS: 19 rounds, 4.75 per second [71.25 DPS]

200 FPS: 40 rounds, 10 per second [200 DPS]
144 FPS: 36 rounds, 9 per second [180 DPS]
120 FPS: 37 rounds, 9.25 per second [185 DPS]
60 FPS: 35 rounds, 8.75 per second [175 DPS]
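One plausible mechanism behind numbers like these, assuming the game only evaluates the refire timer once per rendered frame: the effective interval between shots becomes the refire time rounded up to a whole number of frames. Here's a sketch using an assumed 50 ms refire time (the classic QL LG value); everything here is illustrative, not pulled from the game's actual code:

```python
import math

def effective_rate(fps, refire_ms=50.0):
    """Shots per second when the refire check only passes on a frame
    boundary, so each shot waits for a whole number of frames."""
    frame_ms = 1000.0 / fps
    frames_per_shot = math.ceil(refire_ms / frame_ms)  # round the wait up to whole frames
    return 1000.0 / (frames_per_shot * frame_ms)

for fps in (60, 120, 144, 200):
    print(f"{fps:>3} FPS: {effective_rate(fps):.2f} shots per second")
```

Even this naive model reproduces the 18/s measured at 144 FPS (50 ms doesn't divide evenly into 6.94 ms frames, so each shot waits 8 frames = 55.6 ms), though the measured drops at 60 and 120 FPS are bigger than it predicts — which would suggest the real implementation loses at least one additional frame per shot on top of the rounding.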
Anecdotally, I found during the 200 FPS tests that if I was accidentally in an area where I wasn't quite getting 200 FPS, but rather 180ish, I would still see drops in fire rate...
Consider that in an actual game, with players in it, and with 144 Hz and 120 Hz monitors being all the rage, those are our most common frame rates — there probably isn't anybody around who is consistently getting max fire rates at 200+ FPS.
Thinking about the effect this has particularly on LG: it's not only the drop in DPS that hurts so much, it's the fact that with fewer shots being sent out per second, a sweep over a target has less chance to yield a hit at all. Combine that with the speedy/tiny/pixel-perfect hitboxes and you have one really crappy LG.
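That sweep point can be put in rough numbers: the shots that can even land during a sweep is the fire rate times the time the crosshair spends on the target. The 150 ms time-on-target and the function name below are my assumptions; the rates are from the measurements above:

```python
def shot_opportunities(rate_per_s, time_on_target_s):
    """How many shots go out while the crosshair is over the target."""
    return rate_per_s * time_on_target_s

# expected LG rate vs. the measured 60 FPS rate, over a 150 ms sweep
for fps, rate in ((200, 20.0), (60, 15.25)):
    n = shot_opportunities(rate, 0.150)
    print(f"{fps} FPS ({rate} shots/s): {n:.2f} shot opportunities")
```

Dropping from 20/s to 15.25/s costs you roughly one shot out of every four that could have connected during the sweep — before aim or hitboxes even enter the picture.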