Hoài Vũ Lê

Reputation: 75

CIFilter on AVPlayerItem: Chroma key filter make pixels become black instead of transparent

I followed this document to make a chroma key filter for AVPlayerItem on iOS. I want all pixels that match a condition to become transparent. For now, the condition is whether the pixel's hue is between 0.3 and 0.4 (green pixels).

My filter:

- (CGFloat) hueFromRed:(CGFloat)red green:(CGFloat)green blue:(CGFloat)blue {
    UIColor* color = [UIColor colorWithRed:red green:green blue:blue alpha:1];
    
    CGFloat hue, saturation, brightness;
    [color getHue:&hue saturation:&saturation brightness:&brightness alpha:nil];
    
    return hue;
}

- (CIFilter<CIColorCube> *) chromaKeyFilterHuesFrom:(CGFloat)minHue to:(CGFloat)maxHue {
    const unsigned int size = 64;
    const size_t cubeDataSize = size * size * size * 4;
    NSMutableData* cubeData = [[NSMutableData alloc] initWithCapacity:(cubeDataSize * sizeof(float))];
    
    for (int z = 0; z < size; z++) {
        CGFloat blue = ((double)z)/(size-1);
        for (int y = 0; y < size; y++) {
            CGFloat green = ((double)y)/(size-1);
            for (int x = 0; x < size; x++) {
                CGFloat red = ((double)x)/(size-1);
                
                CGFloat hue = [self hueFromRed:red green:green blue:blue];
                float alpha = (hue >= minHue && hue <= maxHue) ? 0 : 1;
                float premultipliedRed = red * alpha;
                float premultipliedGreen = green * alpha;
                float premultipliedBlue = blue * alpha;
                [cubeData appendBytes:&premultipliedRed length:sizeof(float)];
                [cubeData appendBytes:&premultipliedGreen length:sizeof(float)];
                [cubeData appendBytes:&premultipliedBlue length:sizeof(float)];
                [cubeData appendBytes:&alpha length:sizeof(float)];
            }
        }
    }

    CIFilter<CIColorCube> *colorCubeFilter = CIFilter.colorCubeFilter;
    colorCubeFilter.cubeDimension = size;
    colorCubeFilter.cubeData = cubeData;
    return colorCubeFilter;
}

In the ViewController, I created a button that starts the video player and applies the filter to the AVPlayerItem. [self chromaKeyFilterHuesFrom:0.3 to:0.4] filters out the green pixels:

- (AVVideoComposition*) createVideoComposition:(AVPlayerItem *)_playerItem {
    CIFilter<CIColorCube>* chromaKeyFilter = [self chromaKeyFilterHuesFrom:0.3 to:0.4];
    AVMutableVideoComposition *composition = [AVMutableVideoComposition videoCompositionWithAsset: _playerItem.asset applyingCIFiltersWithHandler:^(AVAsynchronousCIImageFilteringRequest *_Nonnull request) {
        CIImage *image = request.sourceImage.imageByClampingToExtent;
        [chromaKeyFilter setValue:image forKey:kCIInputImageKey];
        CIImage *output = [chromaKeyFilter.outputImage imageByCroppingToRect:request.sourceImage.extent];
        [request finishWithImage:output context:nil];
    }];
    
    return composition;
}

- (IBAction)playVideo:(id)sender {
    NSString *path = [[NSBundle mainBundle] pathForResource:@"my_video_has_green_pixels_zone" ofType:@"mp4"];
    NSURL *url = [NSURL fileURLWithPath:path];
    AVPlayerItem* playerItem = [AVPlayerItem playerItemWithURL:url];
    playerItem.videoComposition = [self createVideoComposition:playerItem];
    
    AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
    player.actionAtItemEnd = AVPlayerActionAtItemEndNone;
    
    AVPlayerViewController *controller =  [[AVPlayerViewController alloc] init];
    controller.player = player;
    controller.view.frame = self.view.bounds;
    controller.view.backgroundColor = UIColor.redColor;
    [[self view] addSubview:controller.view];
    [self presentViewController:controller animated:YES completion:nil];
    
    [player play];
}

The filter works "fine", except that the green pixels become black instead of transparent. I can't figure out why or what makes those pixels black.
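One way to check whether the cube filter itself produces transparency is to render a single frame through a plain CIContext, which draws into an RGBA bitmap and so keeps the alpha channel. A minimal sketch, reusing the chromaKeyFilterHuesFrom: method above (debugRenderWithChromaKey: is just an illustrative helper name):

// Minimal sketch: run one CIImage through the chroma key filter and render it
// with a CIContext, outside the video pipeline, to inspect the alpha channel.
- (UIImage *) debugRenderWithChromaKey:(CIImage *)testImage {
    CIFilter<CIColorCube> *filter = [self chromaKeyFilterHuesFrom:0.3 to:0.4];
    [filter setValue:testImage forKey:kCIInputImageKey];
    CIImage *output = filter.outputImage;

    // CIContext renders into an RGBA bitmap here, so transparency is preserved.
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:output fromRect:output.extent];
    UIImage *result = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return result;
}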

UPDATE

I tried Frank's suggestion and switched from AVPlayerViewController to AVPlayerLayer, and it is now working:

NSDictionary* pixelBufferAttributes = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };

AVPlayerLayer *layerPlayer = [AVPlayerLayer playerLayerWithPlayer:player];
layerPlayer.videoGravity = AVLayerVideoGravityResizeAspect;
layerPlayer.frame = self.view.frame;
layerPlayer.pixelBufferAttributes = pixelBufferAttributes;
[self.view.layer addSublayer:layerPlayer];

However, this is still not what I want since I have to use AVPlayerViewController.

Upvotes: 2

Views: 104

Answers (1)

Frank Rupprecht

Reputation: 10383

The pixel buffers used by the AVPlayerViewController probably don't support the alpha channel by default (to save memory). But you can change the pixel format like this:

AVPlayerViewController *controller = [[AVPlayerViewController alloc] init];
controller.pixelBufferAttributes = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };

Upvotes: 1
