Reputation: 908
I'm finally starting to play with Swift (and macOS development) for the first time. I'm trying to write a simple screen capture app to get started. I've already taken a look at and succeeded in using the AVFoundation APIs for this (AVCaptureSession, AVCaptureScreenInput, etc.). But now I'd like to go a little lower-level and play with the closer-to-the-metal CGDisplayStream API.
Unfortunately I've only been able to get it to capture a single frame. I suspect I may be missing something about how the main run loop and the DispatchQueue I'm passing in interact — I'm not clear on whether those two things interact at all.
Here's a small reproduction of my issue:
import Foundation
import AVFoundation
import CoreGraphics

let mainDisplay = CGMainDisplayID()
let displayBounds = CGDisplayBounds(mainDisplay)
let recordingQueue = DispatchQueue.global(qos: .background)

let displayStreamProps: [CFString: Any] = [
    CGDisplayStream.preserveAspectRatio: kCFBooleanTrue,
    CGDisplayStream.showCursor: kCFBooleanTrue,
    CGDisplayStream.minimumFrameTime: 60,
]

let displayStream = CGDisplayStream(
    dispatchQueueDisplay: mainDisplay,
    outputWidth: Int(displayBounds.width),
    outputHeight: Int(displayBounds.height),
    pixelFormat: Int32(kCVPixelFormatType_32BGRA),
    properties: displayStreamProps as CFDictionary,
    queue: recordingQueue,
    handler: { status, displayTime, frameSurface, updateRef in
        print("is only called once")
    }
)

func quit(_: Int32) {
    displayStream?.stop()
}
signal(SIGINT, quit)

displayStream?.start()
RunLoop.current.run()
Any help would be massively appreciated!!
Upvotes: 3
Views: 1147
Reputation: 2125
Removing this line seems to fix the issue:
CGDisplayStream.minimumFrameTime: 60,
The docs don't say what unit this "time" field uses, but it appears to be seconds. With a value of 60, the stream waits at least 60 seconds between frames, which is why it looked like only a single frame was ever delivered. You can change it to 1.0/60.0 for 60 fps capture.
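For example, here is the properties dictionary from the question with that one value adjusted — a sketch, assuming the unit really is seconds as described above:

```swift
// minimumFrameTime appears to be interpreted as seconds, so
// 1.0/60.0 requests at most one frame every ~16.7 ms (~60 fps).
let displayStreamProps: [CFString: Any] = [
    CGDisplayStream.preserveAspectRatio: kCFBooleanTrue,
    CGDisplayStream.showCursor: kCFBooleanTrue,
    CGDisplayStream.minimumFrameTime: 1.0 / 60.0,
]
```

Everything else in the original code can stay the same.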
Upvotes: 4