CPP_Newb69

Reputation: 65

C++: Incorrect FPS and deltaTime measurement using std::chrono

The fps of my program is incorrect. When I measure the fps of my application with RivaTuner Statistics, it shows, for example, 3000 fps, but my program calculates a really different number, like 500. My number also goes up and down all the time, while RivaTuner's does not.

This is how I calculate the delta time (the deltaTime variable is a float):

std::chrono::high_resolution_clock timer;
auto start = timer.now();

...doing stuff here...

auto stop = timer.now();
deltaTime = std::chrono::duration_cast<std::chrono::microseconds>(stop - start).count() / 1000.0f; // deltaTime was less than 1 millisecond, so I count microseconds and divide to get milliseconds

This is how I calculate the fps:

float fps = (1.0f / deltaTime) * 1000.0f;

I multiply my game speeds by the deltaTime variable, but because it behaves so erratically (jumping up and down the whole time), that is screwed up too. For example, when RivaTuner says 2000 fps my game runs slower than when it says 4000 fps.

But when the application runs slower, it needs more time to render one frame (so a higher deltaTime, and so a higher game speed?).

Is this correct?

Thanks in advance.

Upvotes: 1

Views: 10449

Answers (2)

eerorika

Reputation: 238461

[My fps counter] is going up and down all the time while RivaTuner does not.

Typically, rendering and other calculations take a variable amount of time. If you calculate the fps every frame, then it's expected to go up and down.

deltaTime = std::chrono::duration_cast<std::chrono::microseconds>(stop - start).count() / 1000.0f;

Don't do that. If you want a floating-point value of the milliseconds with minimal loss of precision, then do this:

using ms = std::chrono::duration<float, std::milli>;
deltaTime = std::chrono::duration_cast<ms>(stop - start).count();
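
For reference, a minimal, self-contained version of the corrected measurement might look like this. The per-frame work and the printed output are placeholders, not part of the original answer; note also that converting to a duration with a floating-point representation is implicit, so the duration_cast can even be omitted:

#include <chrono>
#include <iostream>

int main()
{
    using clock = std::chrono::high_resolution_clock;
    using ms = std::chrono::duration<float, std::milli>;

    const auto start = clock::now();

    // ... per-frame work would go here ...

    const auto stop = clock::now();

    // The conversion to ms is implicit because its representation is a
    // floating-point type, so no duration_cast is required.
    const float deltaTime = ms(stop - start).count(); // in milliseconds
    const float fps = 1000.0f / deltaTime;

    std::cout << deltaTime << " ms -> " << fps << " fps\n";
}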

But when the application runs slower it needs more time to render 1 frame

Correct.

so, a higher deltaTime

Correct.

so a higher game speed?

The rendering speed shouldn't affect the speed of the game if everything is scaled in relation to the elapsed time. Whether it does affect the speed is impossible to tell without knowing what your game does.

If it does affect the speed of the game, then there might be something wrong with how you implemented the game. If you have behaviour that is sensitive to the length of the time step, such as physics, then those calculations should be done with a fixed time step, for example 120 times a second. If your fps is higher, then skip advancing the simulation; if your fps is lower, then repeat the simulation.
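
A minimal sketch of such a fixed-time-step loop, assuming hypothetical update() and render() functions (they are placeholders, not from the question), could look like this:

#include <chrono>

using game_clock = std::chrono::steady_clock;
using seconds_f = std::chrono::duration<float>;

// Simulate at a fixed 120 Hz regardless of rendering speed.
constexpr seconds_f fixed_step{1.0f / 120.0f};

void update(float /*dt*/) { /* advance physics by one fixed step */ }
void render() { /* draw one frame */ }

int main()
{
    auto previous = game_clock::now();
    seconds_f accumulator{0.0f};
    bool running = true;

    while (running)
    {
        const auto now = game_clock::now();
        accumulator += now - previous; // implicit conversion to float seconds
        previous = now;

        // At high fps this inner loop runs zero times (the simulation step
        // is skipped); at low fps it runs several times (the simulation is
        // repeated), so game time keeps pace with real time either way.
        while (accumulator >= fixed_step)
        {
            update(fixed_step.count());
            accumulator -= fixed_step;
        }

        render();
    }
}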

Upvotes: 1

Tal

Reputation: 357

As JSQuareD said, when calculating FPS you should take the average over many measured frames. The reason is that frame execution times tend to vary a lot, for many reasons.

Sum your measurements over, let's say, 0.5 seconds and calculate the average. Yes, it is as simple as it sounds.
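
A minimal sketch of that averaging, assuming a plain endless render loop and a 0.5-second window (both are illustrative choices, not from the answer), could look like this:

#include <chrono>
#include <iostream>

int main()
{
    using game_clock = std::chrono::steady_clock;
    using seconds_f = std::chrono::duration<float>;

    auto window_start = game_clock::now();
    int frames = 0;

    for (;;) // stand-in for the real game loop
    {
        // ... render one frame here ...
        ++frames;

        const seconds_f elapsed = game_clock::now() - window_start;
        if (elapsed.count() >= 0.5f) // the window length is tunable
        {
            std::cout << "avg fps: " << frames / elapsed.count() << '\n';
            frames = 0;
            window_start = game_clock::now();
        }
    }
}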

But be careful with this FPS statistic: you could measure even 60 FPS and the game could still look stuck. Why? Because a few frames took a really long delta time while most frames were very fast. (This happens more often than it sounds.)

You can spot this last problem by plotting a graph or calculating the standard deviation, but that is a more advanced concern for now.

Upvotes: 1
