Reputation: 3823
I’m trying to create a video from images using AVFoundation. There are already multiple threads about this approach, but I believe many of them have the same issue I’m facing here.
The video plays fine on the iPhone, but it doesn’t play in VLC, for example, nor does it play correctly on Facebook and Vimeo (sometimes a few frames are out of sync). VLC reports the video’s frame rate as 0.58 fps, but it should be more than 24, right?
Does anyone know what is causing this kind of behavior?
Here is the code used to create a video:
self.videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:videoOutputPath] fileType:AVFileTypeMPEG4 error:&error];

// Codec compression settings
NSDictionary *videoSettings = @{
    AVVideoCodecKey : AVVideoCodecH264,
    AVVideoWidthKey : @(self.videoSize.width),
    AVVideoHeightKey : @(self.videoSize.height),
    AVVideoCompressionPropertiesKey : @{
        AVVideoAverageBitRateKey : @(20000*1000), // 20 000 kbits/s
        AVVideoProfileLevelKey : AVVideoProfileLevelH264High40,
        AVVideoMaxKeyFrameIntervalKey : @(1)
    }
};

AVAssetWriterInput *videoWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];

AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
    assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput
    sourcePixelBufferAttributes:nil];

videoWriterInput.expectsMediaDataInRealTime = NO;
[self.videoWriter addInput:videoWriterInput];

[self.videoWriter startWriting];
[self.videoWriter startSessionAtSourceTime:kCMTimeZero];

[adaptor.assetWriterInput requestMediaDataWhenReadyOnQueue:self.photoToVideoQueue usingBlock:^{
    CMTime time = CMTimeMakeWithSeconds(0, 1000);

    for (Segment *segment in segments) {
        @autoreleasepool {
            UIImage *image = segment.segmentImage;
            CVPixelBufferRef buffer = [self pixelBufferFromImage:image withImageSize:self.videoSize];

            [ImageToVideoManager appendToAdapter:adaptor pixelBuffer:buffer atTime:time];
            CVPixelBufferRelease(buffer);

            CMTime millisecondsDuration = CMTimeMake(segment.durationMS.integerValue, 1000);
            time = CMTimeAdd(time, millisecondsDuration);
        }
    }

    [videoWriterInput markAsFinished];
    [self.videoWriter endSessionAtSourceTime:time];
    [self.videoWriter finishWritingWithCompletionHandler:^{
        NSLog(@"Video writer has finished creating video");
    }];
}];
- (CVPixelBufferRef)pixelBufferFromImage:(UIImage *)image withImageSize:(CGSize)size {
    CGImageRef cgImage = image.CGImage;

    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];

    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                          size.width,
                                          size.height,
                                          kCVPixelFormatType_32ARGB,
                                          (__bridge CFDictionaryRef)options,
                                          &pxbuffer);
    if (status != kCVReturnSuccess) {
        DebugLog(@"Failed to create pixel buffer");
    }

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height, 8, 4 * size.width, rgbColorSpace, 2);
    CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
    CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(cgImage), CGImageGetHeight(cgImage)), cgImage);

    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);
    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    return pxbuffer;
}
+ (BOOL)appendToAdapter:(AVAssetWriterInputPixelBufferAdaptor *)adaptor
            pixelBuffer:(CVPixelBufferRef)buffer
                 atTime:(CMTime)time {
    while (!adaptor.assetWriterInput.readyForMoreMediaData) {
        [[NSRunLoop currentRunLoop] runUntilDate:[NSDate dateWithTimeIntervalSinceNow:0.1]];
    }
    return [adaptor appendPixelBuffer:buffer withPresentationTime:time];
}
Upvotes: 1
Views: 2298
Reputation: 2495
Looking at the code, I think the problem is the way you are using the timestamps.
A CMTime consists of a value and a timescale. The way I think of it is to treat the timescale as essentially the frame rate (that's inaccurate, but it's a useful mental model that works well enough for what you're trying to do).
The first frame of video at 30 FPS would be:
CMTimeMake(1, 30);
Or the 60th frame at 30 frames per second, which (60 divided by 30) is coincidentally also the 2-second point of your video:
CMTimeMake(60, 30);
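In other words, the value divided by the timescale gives the timestamp in seconds, which you can sanity-check with CMTimeGetSeconds:
Float64 seconds = CMTimeGetSeconds(CMTimeMake(60, 30)); // 2.0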
You're specifying 1000 as the timescale, which is far higher than you need. In the loop, you appear to be appending a frame, then adding a second, then appending another frame. That's what's giving you 0.58 FPS (although I would have expected 1 FPS, but who knows the exact intricacies of the codec).
Instead, what you want to do is loop 30 times (if you want the image to show for 1 second / 30 frames) and append the SAME image for each frame. That should get you to 30 FPS. Of course, you can use a timescale of 24 if you want 24 FPS, or whatever suits your requirements.
Try rewriting this section of your code:
[adaptor.assetWriterInput requestMediaDataWhenReadyOnQueue:self.photoToVideoQueue usingBlock:^{
    CMTime time = CMTimeMakeWithSeconds(0, 1000);

    for (Segment *segment in segments) {
        @autoreleasepool {
            UIImage *image = segment.segmentImage;
            CVPixelBufferRef buffer = [self pixelBufferFromImage:image withImageSize:self.videoSize];

            [ImageToVideoManager appendToAdapter:adaptor pixelBuffer:buffer atTime:time];
            CVPixelBufferRelease(buffer);

            CMTime millisecondsDuration = CMTimeMake(segment.durationMS.integerValue, 1000);
            time = CMTimeAdd(time, millisecondsDuration);
        }
    }

    [videoWriterInput markAsFinished];
    [self.videoWriter endSessionAtSourceTime:time];
    [self.videoWriter finishWritingWithCompletionHandler:^{
        NSLog(@"Video writer has finished creating video");
    }];
}];
Into something more like this:
[adaptor.assetWriterInput requestMediaDataWhenReadyOnQueue:self.photoToVideoQueue usingBlock:^{
    // Start at the first frame with a timescale of 30 FPS
    CMTime time = CMTimeMake(1, 30);

    for (Segment *segment in segments) {
        @autoreleasepool {
            UIImage *image = segment.segmentImage;
            CVPixelBufferRef buffer = [self pixelBufferFromImage:image withImageSize:self.videoSize];

            for (int i = 1; i <= 30; i++) {
                [ImageToVideoManager appendToAdapter:adaptor pixelBuffer:buffer atTime:time];
                time = CMTimeAdd(time, CMTimeMake(1, 30)); // Add another "frame"
            }

            CVPixelBufferRelease(buffer);
        }
    }

    [videoWriterInput markAsFinished];
    [self.videoWriter endSessionAtSourceTime:time];
    [self.videoWriter finishWritingWithCompletionHandler:^{
        NSLog(@"Video writer has finished creating video");
    }];
}];
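If each image needs to stay on screen for its own duration rather than a fixed second (your original code uses segment.durationMS), the same idea still works: convert each duration into a whole number of 1/30-second frames and append the same buffer that many times. A rough sketch of just the per-segment loop, assuming durationMS is an NSNumber of milliseconds as in your original code, with everything else in the block unchanged:

for (Segment *segment in segments) {
    @autoreleasepool {
        UIImage *image = segment.segmentImage;
        CVPixelBufferRef buffer = [self pixelBufferFromImage:image withImageSize:self.videoSize];

        // Turn the segment's duration into a whole number of 1/30 s frames,
        // showing the same image for at least one frame.
        NSInteger frameCount = MAX(1, (NSInteger)llround(segment.durationMS.doubleValue / 1000.0 * 30.0));

        for (NSInteger i = 0; i < frameCount; i++) {
            [ImageToVideoManager appendToAdapter:adaptor pixelBuffer:buffer atTime:time];
            time = CMTimeAdd(time, CMTimeMake(1, 30));
        }

        CVPixelBufferRelease(buffer);
    }
}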
Upvotes: 5