YosiFZ

Reputation: 7890

UIImage to CVPixelBufferRef empty

I am using this code to create CVPixelBufferRef:

NSDictionary *videoSettings = @{AVVideoCodecKey: AVVideoCodecH264,
                                    AVVideoWidthKey: [NSNumber numberWithInt:size.width],
                                    AVVideoHeightKey: [NSNumber numberWithInt:size.height]};

self.writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                         outputSettings:videoSettings];

self.adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:self.writerInput sourcePixelBufferAttributes:nil];

CVPixelBufferRef buffer;
CVPixelBufferPoolCreatePixelBuffer(NULL, self.adaptor.pixelBufferPool, &buffer);
buffer = [self pixelBufferFromCGImage:[frame CGImage] size:self.videoSize];

this is the pixelBufferFromCGImage function:

+ (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image
                                  size:(CGSize)imageSize
{
    NSDictionary *options = @{(id)kCVPixelBufferCGImageCompatibilityKey: @YES,
                              (id)kCVPixelBufferCGBitmapContextCompatibilityKey: @YES};
    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, imageSize.width,
                                          imageSize.height, kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef) options,
                                          &pxbuffer);
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, imageSize.width,
                                             imageSize.height, 8, 4*imageSize.width, rgbColorSpace,
                                             kCGImageAlphaNoneSkipFirst);
    NSParameterAssert(context);

    CGContextDrawImage(context, CGRectMake(0 + (imageSize.width-CGImageGetWidth(image))/2,
                                       (imageSize.height-CGImageGetHeight(image))/2,
                                       CGImageGetWidth(image),
                                       CGImageGetHeight(image)), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    return pxbuffer;
}

The problem is that the buffer is always empty. Any idea why this happens?

Upvotes: 1

Views: 1205

Answers (1)

HSG

Reputation: 1224

I guess that the buffer is always nil because the "videoSize" is not valid. I have tested your code in three scenarios, and here are the results.

In the first one I passed a valid image and a size of 200x200. The buffer was not nil.

In the second one I passed nil for the image and a video size of 200x200. The buffer was still not nil.

In the last one I passed a valid image but an invalid video size of 0x0, and the buffer was nil. The return status is:

kCVReturnInvalidArgument (value -6661): Invalid function parameter. For example, out of range or the wrong type.
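
If you replace the NSParameterAssert with an explicit check, this failure shows up at runtime instead of being compiled out in release builds. A minimal sketch of that check inside pixelBufferFromCGImage (the log message is just an example):

    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, imageSize.width,
                                          imageSize.height, kCVPixelFormatType_32ARGB,
                                          (__bridge CFDictionaryRef)options, &pxbuffer);
    if (status != kCVReturnSuccess || pxbuffer == NULL) {
        // With a 0x0 size this logs -6661 (kCVReturnInvalidArgument).
        NSLog(@"CVPixelBufferCreate failed with status %d", (int)status);
        return NULL;
    }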

Hope this will help you.

This is the code that I tested.

- (void)viewDidLoad
{
    [super viewDidLoad];

    UIImage *image = [UIImage imageNamed:@"Doge-Meme.jpg"];
    CVPixelBufferRef bufferRef = [[self class] pixelBufferFromCGImage:image.CGImage size:CGSizeMake(200, 200)];
    NSLog(@"## %@", bufferRef);//not nil

    CVPixelBufferRef bufferRef1 = [[self class] pixelBufferFromCGImage:nil size:CGSizeMake(200, 200)];
    NSLog(@"## %@", bufferRef1);//not nil

    CVPixelBufferRef bufferRef2 = [[self class] pixelBufferFromCGImage:image.CGImage size:CGSizeMake(0, 0)];
    NSLog(@"## %@", bufferRef2);//nil
}

+ (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image
                                      size:(CGSize)imageSize
{
    NSDictionary *options = @{(id)kCVPixelBufferCGImageCompatibilityKey: @YES,
                              (id)kCVPixelBufferCGBitmapContextCompatibilityKey: @YES};
    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, imageSize.width,
                                          imageSize.height, kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef) options,
                                          &pxbuffer);
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, imageSize.width,
                                                 imageSize.height, 8, 4*imageSize.width, rgbColorSpace,
                                                 kCGImageAlphaNoneSkipFirst);
    NSParameterAssert(context);

    CGContextDrawImage(context, CGRectMake(0 + (imageSize.width-CGImageGetWidth(image))/2,
                                           (imageSize.height-CGImageGetHeight(image))/2,
                                           CGImageGetWidth(image),
                                           CGImageGetHeight(image)), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    return pxbuffer;
}
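
So in your code the value of self.videoSize is the thing to verify before creating the buffer. A minimal guard on the caller side, assuming self.videoSize and frame are the values from your question, might look like this:

    CGSize size = self.videoSize;
    if (size.width < 1 || size.height < 1) {
        // An empty size makes CVPixelBufferCreate fail with kCVReturnInvalidArgument (-6661).
        NSLog(@"Invalid video size %@", NSStringFromCGSize(size));
        return;
    }

    CVPixelBufferRef buffer = [[self class] pixelBufferFromCGImage:[frame CGImage] size:size];
    // ... use the buffer, then release it, since pixelBufferFromCGImage returns it retained.
    CVPixelBufferRelease(buffer);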

Upvotes: 2
