Reputation: 548
I'm using GPUImage on iOS. I'm having trouble transforming (scaling) a GPUImagePicture with GPUImageTransformFilter and blending it into a video.
overlayImage = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"Sticker"] smoothlyScaleOutput:YES];
blendFilter = [GPUImageAlphaBlendFilter new];
transformFilter = [GPUImageTransformFilter new];
// [transformFilter forceProcessingAtSize:?????];
transformFilter.affineTransform = CGAffineTransformMakeScale(0.2f, 0.2f);
[overlayImage addTarget:transformFilter];
[videoCamera addTarget:blendFilter];
[overlayImage addTarget:blendFilter];
[overlayImage processImage];
[videoCamera startCameraCapture];
The result I'm getting is incorrect: the image appears untransformed, still at full size, as if the transform filter had never been applied.
Is it correct that the transformed GPUImagePicture will need to be processed at the exact same size as the other contents of the blend filter?
If so, how do I do this? Should I use forceProcessingAtSize? Can I obtain the size by querying something, such as the videoCamera's session? I tried setting forceProcessingAtSize to the size of the AVCaptureSessionPreset, 640x480, but this didn't help.
Thanks
Upvotes: 0
Views: 384
Reputation: 170317
You haven't connected the transformFilter in the above code to anything, so it is being ignored. I believe you need to correct your code to read
[overlayImage addTarget:transformFilter];
[videoCamera addTarget:blendFilter];
[transformFilter addTarget:blendFilter];
if you want your transformed results to be blended.
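For reference, here is a minimal sketch of the full corrected pipeline (untested; it assumes a GPUImageView named filterView already in your view hierarchy and the 640x480 preset mentioned in the question):

GPUImageVideoCamera *videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];

GPUImagePicture *overlayImage = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"Sticker"] smoothlyScaleOutput:YES];

GPUImageTransformFilter *transformFilter = [GPUImageTransformFilter new];
transformFilter.affineTransform = CGAffineTransformMakeScale(0.2f, 0.2f);

GPUImageAlphaBlendFilter *blendFilter = [GPUImageAlphaBlendFilter new];

// The camera is the blend's first input; the transformed picture is the second.
[videoCamera addTarget:blendFilter];
[overlayImage addTarget:transformFilter];
[transformFilter addTarget:blendFilter];

// Display the blended output (filterView is an assumed GPUImageView outlet).
[blendFilter addTarget:filterView];

[overlayImage processImage];
[videoCamera startCameraCapture];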
forceProcessingAtSize won't really do what you want here, because that adjusts the underlying pixel size, not how the image is presented for a blend. Blend filters use the aspect ratio of the first image and stretch the second image to fit it. That's a consequence of the fairly simple normalized texture coordinates I use, and I've unfortunately never added an option to preserve the aspect ratio of the second image.
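If you need the overlay to keep its proportions anyway, one possible workaround (a sketch only, not something I've tested here; the 640x480 preset and a square sticker image are assumptions) is to fold a compensating scale into the transform itself:

// Counteract the horizontal stretch the blend applies to its second input.
// For a 4:3 video feed and a 1:1 sticker, squeezing x by 3/4 restores the
// sticker's original proportions after the blend stretches it to 4:3.
CGFloat videoAspect = 640.0 / 480.0;   // aspect ratio of the first input
CGFloat stickerAspect = 1.0;           // assumed square sticker image
CGFloat compensation = stickerAspect / videoAspect;
transformFilter.affineTransform = CGAffineTransformMakeScale(0.2f * compensation, 0.2f);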
Upvotes: 1