Reputation: 60868
I've got a WebGL scene and a set of parameters I can use to trade render quality against speed. I'd like to display the scene at the highest quality that doesn't push the frame rate below some threshold. To achieve this, I have to somehow measure the “current” frame rate in response to changes in quality.
But the scene is static as long as the user doesn't interact with it (e.g. by rotating the camera with the mouse). I don't want a loop re-rendering the same scene all the time when nothing changes; I want to stop rendering when the scene stops moving. That means I can't simply average the time between successive frames, since I can't distinguish between the renderer being slow and the user just moving their mouse more slowly.
I thought about rendering the scene a number of times at startup and judging the frame rate from that. But the complexity of the scene might change over time, depending on the portion of it visible from the current camera position or on user interaction outside the canvas. So I have to adapt the quality as the scene's complexity changes. Running a calibration loop after every mouse release might be an option.
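A rough sketch of such a calibration pass, assuming a `drawScene()` function that renders one frame at the current quality settings (the name is a placeholder, not anything WebGL provides):

```javascript
// Hypothetical calibration pass: render a fixed number of frames via
// requestAnimationFrame and report the mean interval between callbacks.
// Because requestAnimationFrame is capped at the display refresh rate,
// the mean only rises above the refresh period once rendering becomes
// the bottleneck, which is exactly the regime we need to detect.
function calibrate(drawScene, frameCount, done) {
  const intervals = [];
  let last = null;
  function step(now) {
    if (last !== null) intervals.push(now - last); // skip the first, partial sample
    last = now;
    drawScene();
    if (intervals.length < frameCount) {
      requestAnimationFrame(step);
    } else {
      done(intervals.reduce((a, b) => a + b, 0) / intervals.length);
    }
  }
  requestAnimationFrame(step);
}

// e.g. run after every mouse release:
calibrate(drawScene, 20, function (msPerFrame) {
  console.log('average frame interval: ' + msPerFrame.toFixed(1) + ' ms');
});
```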
I also thought about using the `finish` call instead of the `flush` call to accurately measure render time. But while I wait for GL to finish rendering, my application will essentially be unresponsive; in particular, it won't be able to queue mouse events. Since I envision the rendering to ideally take up all the time between two frames at the target threshold frame rate, that would probably be rather bad. I might get away with using `finish` instead of `flush` only on some occasions, like after a mouse release.
What's the best way to achieve a desired frame rate in a WebGL application, or to measure render time on a regular basis?
Upvotes: 1
Views: 817
Reputation: 2747
Why can't you use the average render time?
Just because you render less often doesn't mean you can't average the render times. It will just take a bit longer to collect an accurate average.
Once the user starts moving their mouse you'll get an influx of data, and that should quickly give you an average render rate anyway.
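A sketch of that idea, assuming a render loop that only runs while the user interacts (`drawScene()` again stands in for your render function): each frame that actually gets drawn contributes one sample to an exponential moving average, so the estimate keeps working no matter how irregularly frames occur.

```javascript
let avgInterval = null;   // exponentially smoothed ms between frames
let lastFrame = null;
let interacting = false;

function loop(now) {
  if (lastFrame !== null) {
    const dt = now - lastFrame;
    // exponential moving average: recent frames dominate, old data decays
    avgInterval = avgInterval === null ? dt : 0.9 * avgInterval + 0.1 * dt;
  }
  lastFrame = now;
  drawScene();
  if (interacting) {
    requestAnimationFrame(loop);
  } else {
    lastFrame = null; // don't let an idle gap pollute the average
  }
}

canvas.addEventListener('mousedown', function () {
  interacting = true;
  requestAnimationFrame(loop);
});
canvas.addEventListener('mouseup', function () {
  interacting = false;
});
```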
Upvotes: 1