bd1170

Reputation: 13

iOS camera preview is unstable

Apple provides a sample iOS project called AVCamFilter where they use an MTKView to render the camera preview to the screen. The problem is that the frame duration - the amount of time each individual camera frame spends being displayed on the screen - is not stable. The camera stream is running at 30 FPS, that is one frame delivered every 33.3 milliseconds, so it stands to reason that we should be able to display these frames one after another with a frame duration of 33.3 ms. In reality the frame duration is unstable - most of the time it is about 33.3 ms but sometimes it is roughly 16.7 or 50.1 ms.

According to the documentation, the default behavior for MTKView is to draw to the screen at 60 FPS, that is one frame every 16.7 ms, so each camera frame should be drawn twice before it is replaced by the next one. The fact that MTKView's draw loop and the camera's capture loop are not synchronized explains the problem: the time elapsed between a capture and the next draw gradually drifts, so frames go from being captured just before a draw to just after a draw, and as a result some frames are displayed for three draws or one draw instead of the expected two. The MTKView draw loop can be synchronized with the camera by disabling the default behavior and manually calling draw whenever a new camera frame arrives, but all of this is still not synchronized with the device display itself, which refreshes at 60 FPS on its own schedule, so the same fundamental timing problem remains. I know CADisplayLink is meant for synchronizing things to the display, but there is no way for the capture stream to use it.
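For reference, the manual-draw variant mentioned above looks roughly like this (a minimal sketch; the class and property names are illustrative, and the texture-conversion step is omitted):

```swift
import AVFoundation
import MetalKit

final class CameraPreviewDriver: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let cameraView: MTKView // illustrative name; the MTKView rendering the preview

    init(cameraView: MTKView) {
        self.cameraView = cameraView
        super.init()
        // Disable MTKView's internal 60 FPS timer so it only draws on demand.
        cameraView.isPaused = true
        cameraView.enableSetNeedsDisplay = false
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // ... convert sampleBuffer to a Metal texture for the renderer here ...

        // One explicit draw per camera frame. This ties drawing to capture,
        // but the draw is still not phase-locked to the display's refresh.
        cameraView.draw()
    }
}
```

Even with this setup, the presented frame still lands on whichever display refresh happens next, which is why the drift described above persists.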

So how do we render the camera preview with a stable 33.3 ms frame duration? Is the instability an expected behavior with no workaround?


Edit

Some observations on the behavior of the AVCamFilter project with different MTKView settings: in all cases the total display time of a camera frame (frame duration × draws per frame) is not stable at 33.3 ms.

Upvotes: 0

Views: 355

Answers (2)

ImShrey

Reputation: 418

If you want to run MTKView's draw calls at a specific framerate, why not try the preferredFramesPerSecond property?

As per the Apple docs:

When your application sets its preferred frame rate, the view chooses a frame rate as close to that as possible based on the capabilities of the screen the view is displayed on. To provide a consistent frame rate, the actual frame rate chosen is usually a factor of the maximum refresh rate of the screen. For example, if the maximum refresh rate of the screen is 60 frames per second, that’s also the highest frame rate the view sets as the actual frame rate. However, if you ask for a lower frame rate, the view might choose 30, 20, or 15 frames per second, or another factor, as the actual frame rate. Your application should choose a frame rate that it can consistently maintain. The default value is 60 frames per second.
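Setting it to match the camera's capture rate is a one-liner (a minimal sketch; the view setup around it is assumed):

```swift
import MetalKit

// Ask the view to target 30 FPS, matching the camera's capture rate.
// The actual rate chosen will be a factor of the display's maximum
// refresh rate, per the documentation quoted above.
let mtkView = MTKView(frame: .zero, device: MTLCreateSystemDefaultDevice())
mtkView.preferredFramesPerSecond = 30
```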

In my observation this did solve the sync issue when I worked on live filters over a camera feed.

Also, I don't know if I understood your problem correctly, because in theory, if you are using the latest available frame from the AVCaptureSession in your rendering cycle, then no matter what FPS it runs at, you should not see any hiccups unless your rendering fails to hold a steady framerate or drops below the camera's capture framerate.

Upvotes: 0

Hamid Yusifli

Reputation: 10137

If I understand your intention correctly, you expect the camera (AVCaptureSession) to produce sample buffers at a rate of 30 FPS and expect MTKView to consume these images at exactly the same rate of 30 FPS to render them to the screen.

MTKView does not guarantee that drawInMTKView will always be called at 30 FPS. For various reasons, the callback to drawInMTKView may occasionally be skipped. The same is true for AVCaptureSession.

Upvotes: 0
