Reputation: 751
I'm using this Swift class (shown originally in the answer to this question: Capture Metal MTKView as Movie in realtime?) to try to record my Metal app's frames to a movie file.
import AVFoundation
import Metal
import QuartzCore

class MetalVideoRecorder {
    var isRecording = false
    var recordingStartTime = TimeInterval(0)

    private var assetWriter: AVAssetWriter
    private var assetWriterVideoInput: AVAssetWriterInput
    private var assetWriterPixelBufferInput: AVAssetWriterInputPixelBufferAdaptor

    init?(outputURL url: URL, size: CGSize) {
        do {
            assetWriter = try AVAssetWriter(outputURL: url, fileType: .m4v)
        } catch {
            return nil
        }

        let outputSettings: [String: Any] = [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: size.width,
            AVVideoHeightKey: size.height
        ]

        assetWriterVideoInput = AVAssetWriterInput(mediaType: .video, outputSettings: outputSettings)
        assetWriterVideoInput.expectsMediaDataInRealTime = true

        // The pixel format here must match the texture we copy from (bgra8Unorm -> 32BGRA).
        let sourcePixelBufferAttributes: [String: Any] = [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA,
            kCVPixelBufferWidthKey as String: size.width,
            kCVPixelBufferHeightKey as String: size.height
        ]

        assetWriterPixelBufferInput = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: assetWriterVideoInput,
                                                                           sourcePixelBufferAttributes: sourcePixelBufferAttributes)

        assetWriter.add(assetWriterVideoInput)
    }

    func startRecording() {
        assetWriter.startWriting()
        assetWriter.startSession(atSourceTime: .zero)

        recordingStartTime = CACurrentMediaTime()
        isRecording = true
    }

    func endRecording(_ completionHandler: @escaping () -> ()) {
        isRecording = false

        assetWriterVideoInput.markAsFinished()
        assetWriter.finishWriting(completionHandler: completionHandler)
    }

    func writeFrame(forTexture texture: MTLTexture) {
        if !isRecording {
            return
        }

        // Spin until the writer input can accept another sample buffer.
        while !assetWriterVideoInput.isReadyForMoreMediaData {}

        guard let pixelBufferPool = assetWriterPixelBufferInput.pixelBufferPool else {
            print("Pixel buffer asset writer input did not have a pixel buffer pool available; cannot retrieve frame")
            return
        }

        var maybePixelBuffer: CVPixelBuffer? = nil
        let status = CVPixelBufferPoolCreatePixelBuffer(nil, pixelBufferPool, &maybePixelBuffer)
        if status != kCVReturnSuccess {
            print("Could not get pixel buffer from asset writer input; dropping frame...")
            return
        }

        guard let pixelBuffer = maybePixelBuffer else { return }

        CVPixelBufferLockBaseAddress(pixelBuffer, [])
        let pixelBufferBytes = CVPixelBufferGetBaseAddress(pixelBuffer)!

        // Use the bytes-per-row value from the pixel buffer, since its stride may be rounded up to a 16-byte boundary.
        let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
        let region = MTLRegionMake2D(0, 0, texture.width, texture.height)

        texture.getBytes(pixelBufferBytes, bytesPerRow: bytesPerRow, from: region, mipmapLevel: 0)

        let frameTime = CACurrentMediaTime() - recordingStartTime
        let presentationTime = CMTimeMakeWithSeconds(frameTime, preferredTimescale: 240)
        assetWriterPixelBufferInput.append(pixelBuffer, withPresentationTime: presentationTime)

        CVPixelBufferUnlockBaseAddress(pixelBuffer, [])
    }
}
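For context, here is roughly how I drive the recorder (just a sketch; the 1280x720 size and the temporary-directory URL are placeholders, not my real values):

// Hypothetical setup: record to a temporary file at the drawable's size.
let url = FileManager.default.temporaryDirectory.appendingPathComponent("capture.m4v")
guard let recorder = MetalVideoRecorder(outputURL: url, size: CGSize(width: 1280, height: 720)) else {
    fatalError("Could not create asset writer")
}
recorder.startRecording()
// ... call recorder.writeFrame(forTexture:) once per rendered frame ...
recorder.endRecording {
    print("Finished writing \(url.path)")
}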
I am not seeing any errors, but the frames in the resulting QuickTime file are all black. The frames are the correct size, and my pixel format is correct (bgra8Unorm). Does anyone know why it might not be working?
I am calling the writeFrame function before I present and commit the current drawable, like this:
if let drawable = view.currentDrawable {
    if BigVideoWriter != nil && BigVideoWriter!.isRecording {
        commandBuffer.addCompletedHandler { commandBuffer in
            BigVideoWriter?.writeFrame(forTexture: drawable.texture)
        }
    }
    commandBuffer.present(drawable)
    commandBuffer.commit()
}
I did get an error initially, saying that my MTKView's layer was framebufferOnly. So I set that to false before trying to record, which got rid of the error, but the frames are still all black. I also tried setting it to false at the very beginning of the program, but I get the same results.
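For reference, this is the configuration I mean (a sketch, assuming the view is an MTKView named metalView):

// framebufferOnly must be false for texture.getBytes(...) to be legal on the
// drawable's texture, and the color pixel format must match the 32BGRA
// pixel buffers the recorder creates.
metalView.framebufferOnly = false
metalView.colorPixelFormat = .bgra8Unorm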
I also tried using addCompletedHandler instead of addScheduledHandler, but that gives me the error "[CAMetalLayerDrawable texture] should not be called after already presenting this drawable. Get a nextDrawable instead."
Thanks for any suggestions!
EDIT: I got this resolved with the help of @Idogy. Testing revealed that the original version worked on iOS but not on the Mac. He said that since I have an NVIDIA GPU, the framebuffers are private, so I had to add a blit command encoder with a synchronize(resource:) call on the texture. Then it started working. Like this:
if let drawable = view.currentDrawable {
    if BigVideoWriter != nil && BigVideoWriter!.isRecording {
        #if os(macOS)
        // On a discrete GPU the drawable's texture lives in private VRAM, so it
        // must be synchronized back to system memory before the CPU can read it.
        if let blitCommandEncoder = commandBuffer.makeBlitCommandEncoder() {
            blitCommandEncoder.synchronize(resource: drawable.texture)
            blitCommandEncoder.endEncoding()
        }
        #endif
        commandBuffer.addCompletedHandler { commandBuffer in
            BigVideoWriter?.writeFrame(forTexture: drawable.texture)
        }
    }
    commandBuffer.present(drawable)
    commandBuffer.commit()
}
Upvotes: 4
Views: 1003
Reputation: 2869
I believe you are writing your frames too early -- by calling writeFrame directly from within your render loop, you are essentially capturing the drawable at a time when it is still empty (the GPU just hasn't rendered it yet).

Remember that before you call commandBuffer.commit(), the GPU hasn't even begun rendering your frame. You need to wait for the GPU to finish rendering before trying to grab the resulting frame. The sequence is a bit confusing because you're also calling present() before calling commit(), but that isn't the actual order of operations at run time. That present call merely tells Metal to schedule your frame for presentation to the screen once the GPU has finished rendering.

You should call writeFrame from within a completion handler (using commandBuffer.addCompletedHandler()). That should take care of this.
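A minimal sketch of that pattern (assuming a recorder instance named recorder, inside a draw loop where view and commandBuffer already exist):

// Encode all rendering work first, then read the drawable back only after
// the GPU has actually finished executing this command buffer.
if let drawable = view.currentDrawable {
    commandBuffer.addCompletedHandler { _ in
        // Runs on a background thread once the GPU is done with this frame.
        recorder.writeFrame(forTexture: drawable.texture)
    }
    commandBuffer.present(drawable)
    commandBuffer.commit()
}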
UPDATE: While the answer above is correct, it is only partial. Since the OP was using a discrete GPU with private VRAM, the CPU wasn't able to see the render target's pixels. The solution to that problem is to add an MTLBlitCommandEncoder and use its synchronize(resource:) method to ensure the rendered pixels are copied back from the GPU's VRAM to system RAM.
Upvotes: 4