The faster something moves on screen, the higher your framerate needs to be to keep perceived motion blur at the same level.
A 2D point-and-click adventure at 30fps could have motion blur comparable to a competitive shooter at 180fps, for example.
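To put rough numbers on that (the pixel speeds here are made up purely for illustration), the blur you perceive on a sample-and-hold display scales with how far an object travels during a single frame:

```python
# Per-frame displacement drives perceived blur on sample-and-hold displays,
# so blur scales with pixel speed divided by framerate.
def travel_px_per_frame(speed_px_per_s: float, fps: float) -> float:
    """Distance an object moves during one frame, in pixels."""
    return speed_px_per_s / fps

# Assumed speeds: a slow point-and-click scene vs. a fast shooter pan.
print(travel_px_per_frame(300, 30))    # 10.0 px per frame at 30fps
print(travel_px_per_frame(1800, 180))  # 10.0 px per frame at 180fps -> comparable blur
```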
Framerate is inversely proportional to frametime, which is why it gets harder to notice a difference the higher you go.
From 30 to 60? That’s an improvement of 16.67ms. 60 to 120 gains 8.33ms, 120 to 240 only 4.17ms, and so on.
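You can see the diminishing returns by just computing the frametime deltas; a quick sketch:

```python
# Frametime in ms is 1000 / fps, so every doubling of framerate halves
# the absolute improvement -- the diminishing returns described above.
steps = [30, 60, 120, 240, 480]
for lo, hi in zip(steps, steps[1:]):
    gain = 1000 / lo - 1000 / hi
    print(f"{lo:>3} -> {hi:>3} fps: frametime improves by {gain:.2f} ms")
# 30 ->  60 fps: frametime improves by 16.67 ms
# 60 -> 120 fps: frametime improves by 8.33 ms
# 120 -> 240 fps: frametime improves by 4.17 ms
# 240 -> 480 fps: frametime improves by 2.08 ms
```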
Ah, something I want to add:
That only explains the visual aspect, though; frametimes are also directly tied to latency.
Some people might notice the visual difference less than the latency benefit. That’s the one topic where opinions on frame generation seem to clash the most, since the interpolated frames provide smoother motion on screen but don’t change the latency.
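As a toy model of that split (my own simplification, not a measurement of any specific frame generation implementation): with 2x interpolation the presented framerate doubles, but input is only sampled on rendered frames, so latency stays tied to the base framerate (and in practice often gets slightly worse from holding back a frame).

```python
# Toy model (assumptions, not measurements): frame generation doubles the
# *presented* framerate, but input is only sampled on rendered frames, so
# latency stays tied to the base framerate.
def frame_gen(base_fps: float, multiplier: int = 2) -> tuple[float, float]:
    presented_fps = base_fps * multiplier   # smoother motion on screen
    motion_step_ms = 1000 / presented_fps   # what your eyes see
    input_step_ms = 1000 / base_fps         # what your hands feel (at best)
    return motion_step_ms, input_step_ms

motion, latency = frame_gen(60)
print(f"presented step: {motion:.2f}ms, input sampling step: {latency:.2f}ms")
# presented step: 8.33ms, input sampling step: 16.67ms
```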