Reputation: 1996
I have a CADisplayLink and an AVPlayerItemVideoOutput under iOS 6.1. From those I get a CVPixelBufferRef and want to modify it (e.g. draw a timecode onto the video). I think I have the modification part implemented correctly and now have a CGImageRef or a CGContextRef with the new frame.
My question now: how am I supposed to display that? I found dedicated classes for OS X, but none for iOS. Are there any? Can I use the GPUImage framework as a shortcut? (Unfortunately the dedicated GPUImage movie player doesn't do audio :( ).
(Is there any good book on advanced AVFoundation?)
Here is my code, but the drawn text isn't shown on the video (video and audio just play nicely). The code is called (I checked) from the CADisplayLink callback.
CVPixelBufferRef ref = [_playerItemVideoOutput copyPixelBufferForItemTime:_playerItem.currentTime itemTimeForDisplay:nil];

// Lock the buffer before touching its pixel data
CVPixelBufferLockBaseAddress(ref, 0);

size_t width = CVPixelBufferGetWidth(ref);
size_t height = CVPixelBufferGetHeight(ref);
size_t bytes_per_row = CVPixelBufferGetBytesPerRow(ref);
void *pxdata = CVPixelBufferGetBaseAddress(ref);

// Wrap the pixel data in a bitmap context so Core Graphics can draw into it
CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pxdata, width, height, 8,
                                             bytes_per_row, rgbColorSpace,
                                             kCGImageAlphaNoneSkipFirst);

// Fill with white so the text is visible over the video;
// the length argument has to match the string (25 characters, not 20)
CGContextSetRGBFillColor(context, 1.0, 1.0, 1.0, 1.0);
CGContextSelectFont(context, "Helvetica", 10, kCGEncodingMacRoman);
CGContextSetTextDrawingMode(context, kCGTextFill);
CGContextShowTextAtPoint(context, 0.0, 0.0, "O BROTHER WHERE ARE THOU?", 25);

CGColorSpaceRelease(rgbColorSpace);
CGContextRelease(context);   // balances CGBitmapContextCreate; the extra CGContextRetain was a leak
CVPixelBufferUnlockBaseAddress(ref, 0);
CVBufferRelease(ref);
Edit: My problem is/was that you can't display the modified pixel buffer with an AVPlayerLayer. You have to get it on screen some other way (on the Mac there is AVSampleBufferDisplayLayer, but there is no public equivalent on iOS :/ ).
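For reference, one way to get the modified buffer on screen under iOS 6 is to render it yourself with Core Image into a GLKView while the AVPlayer keeps playing (so the audio stays in sync). This is only a minimal sketch; _glView, _eaglContext and _ciContext are placeholder ivars you have to set up yourself:

#import <GLKit/GLKit.h>
#import <CoreImage/CoreImage.h>

// One-time setup (e.g. in viewDidLoad); _glView is a GLKView already in the view hierarchy
_eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
_glView.context = _eaglContext;
_ciContext = [CIContext contextWithEAGLContext:_eaglContext];

// In the CADisplayLink callback, after modifying the pixel buffer and before releasing it:
CIImage *frame = [CIImage imageWithCVPixelBuffer:ref];
[_glView bindDrawable];
[_ciContext drawImage:frame
               inRect:CGRectMake(0, 0, _glView.drawableWidth, _glView.drawableHeight)
             fromRect:[frame extent]];
[_glView display];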
Upvotes: 3
Views: 4700
Reputation: 19641
I think you asked the wrong question. It seems you want to process a video in real time, but then you ask about CGImageRef?
I recommend you study Session 517 of WWDC 2012. There is sample code for OS X and iOS; download all [related] AVFoundation samples and check them out. Search for the new iOS 6 methods to narrow your quest.
Once you have the code in place (basically a CADisplayLink callback where you extract pixel buffers with copyPixelBufferForItemTime:itemTimeForDisplay: at the right times), you can modify the pixel buffer with GPUImage, Core Image or just your own OpenGL code. All this must be done while the video is playing in AVPlayer, otherwise you will only be able to display images but not audio, at least not synchronized audio.
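Very roughly, the setup looks like this (a sketch only; _videoOutput, _playerItem, _player, _displayLink and displayLinkFired: are placeholder names, not taken from the sample code):

#import <AVFoundation/AVFoundation.h>
#import <QuartzCore/QuartzCore.h>

// Attach a video output to the player item and poll it from a CADisplayLink
NSDictionary *attrs = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
_videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attrs];
[_playerItem addOutput:_videoOutput];

_displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(displayLinkFired:)];
[_displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
[_player play];   // keep playing in AVPlayer so the audio stays synchronized

- (void)displayLinkFired:(CADisplayLink *)link
{
    // Ask the output for the frame that should be on screen right now
    CMTime itemTime = [_videoOutput itemTimeForHostTime:CACurrentMediaTime()];
    if ([_videoOutput hasNewPixelBufferForItemTime:itemTime]) {
        CVPixelBufferRef pixelBuffer = [_videoOutput copyPixelBufferForItemTime:itemTime
                                                             itemTimeForDisplay:NULL];
        // modify the buffer here (GPUImage, Core Image, your own OpenGL) and render it
        CVBufferRelease(pixelBuffer);
    }
}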
I can help you more if you ask something specific.
Upvotes: 3