Reputation: 873
I wonder if calling updateGL() at fixed timer intervals may slow down the rendering process, so I want to try making the rendering real-time. I do not know which function makes it execute automatically. Does anyone know?
Secondly, when I commented out updateGL() in my program, CPU usage dropped to 0–1%. Now, when I click on the display widget, it jumps to 14%. Why? Isn't the GPU doing all the rendering work? If so, why does CPU usage jump to 14%, and how can I resolve this?
Upvotes: 10
Views: 7464
Reputation: 7748
Yes, you can make it real-time with no tearing (i.e., redrawing at exactly 60 fps, the refresh rate of your monitor).
To do this, enable V-sync and use a QTimer with an interval of 0. With V-sync enabled, the automatically called swapBuffers() makes the CPU wait for the vertical-refresh signal from your monitor, so the timer ends up synchronized with the monitor's refresh rate.
Related information can be found here: https://www.qt.io/blog/2010/12/02/velvet-and-the-qml-scene-graph. Note the QGLFormat::setSwapInterval(1) call, which enables V-sync in case your driver settings do not do so automatically:
class MyGLWidget : public QGLWidget
{
public:
    MyGLWidget();

    // ...

private:
    QTimer timer;
};
static QGLFormat desiredFormat()
{
    QGLFormat fmt;
    fmt.setSwapInterval(1); // request V-sync (one swap per vertical refresh)
    return fmt;
}
MyGLWidget::MyGLWidget() :
    QGLWidget(desiredFormat())
{
    // Configure the timer
    connect(&timer, SIGNAL(timeout()), this, SLOT(updateGL()));
    if(format().swapInterval() == -1)
    {
        // V-blank synchronization not available (tearing likely to happen)
        qDebug("Swap Buffers at v_blank not available: refresh at approx 60fps.");
        timer.setInterval(17);
    }
    else
    {
        // V-blank synchronization available: let swapBuffers() pace the loop
        timer.setInterval(0);
    }
    timer.start();
}
In parallel, you can run a QElapsedTimer to measure how much time has passed between two frames (normally approx. 16.6 ms at 60 fps), and use that delta to update your scene, for instance.
Upvotes: 13