Reputation: 7383
I'm new to CoreImage / Metal, so my apologies in advance if my question is naive. I spent a week going over the CoreImage documentation and examples and I couldn't figure this one out.
Suppose I have a reduction filter such as CIAreaAverage which outputs a 1x1 image. Is it possible to convert that image into a color that I can pass as an argument of another CIFilter? I'm aware that I can do this by rendering the CIAreaAverage output into a CVPixelBuffer, but I'm trying to do this in one render pass.
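For reference, the readback I'm trying to avoid looks roughly like this (a sketch using CIContext.render(_:toBitmap:...) instead of an explicit CVPixelBuffer; averageImage stands for the 1x1 CIAreaAverage output):
import CoreImage

let context = CIContext()
var bitmap = [UInt8](repeating: 0, count: 4)
// Render the single averaged pixel into a 4-byte RGBA buffer on the CPU.
context.render(averageImage,
               toBitmap: &bitmap,
               rowBytes: 4,
               bounds: CGRect(x: 0, y: 0, width: 1, height: 1),
               format: .RGBA8,
               colorSpace: CGColorSpaceCreateDeviceRGB())
let averageColor = CIColor(red: CGFloat(bitmap[0]) / 255,
                           green: CGFloat(bitmap[1]) / 255,
                           blue: CGFloat(bitmap[2]) / 255,
                           alpha: CGFloat(bitmap[3]) / 255)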
Edit #1 (Clarification):
Let's say I want to correct the white balance by letting the user sample a gray pixel from the image:
let pixelImage = inputImage.applyingFilter("CICrop", parameters: [
    "inputRectangle": CIVector(cgRect: CGRect(origin: pixelCoordinates, size: CGSize(width: 1, height: 1)))
])
// We know now that the extent of pixelImage is 1x1.
// Do something to convert the image into a pixel to be passed as an argument in the filter below.
let pixelColor = ???
let outputImage = inputImage.applyingFilter("CIWhitePointAdjust", parameters: [
    "inputColor": pixelColor
])
Is there a way to tell the CIContext to convert the 1x1 CIImage into a CIColor?
Upvotes: 2
Views: 449
Reputation: 10408
If you want to use the result of CIAreaAverage in a custom CIFilter (i.e., you don't need it for a CIColor parameter), you can pass it directly as a CIImage to that filter and read the value via a sampler in the kernel:
extern "C" float4 myKernel(sampler someOtherInput, sampler average) {
float4 avg = average.sample(float2(0.5, 0.5)); // average only contains one pixel, so always sample that
// ...
}
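For context, here is a minimal sketch of how that kernel could be wired up from Swift (assuming the function above is compiled into the app's default.metallib, and someOtherInput and average are the second input image and the unclamped 1x1 CIAreaAverage output, respectively):
let url = Bundle.main.url(forResource: "default", withExtension: "metallib")!
let data = try! Data(contentsOf: url)
let kernel = try! CIKernel(functionName: "myKernel", fromMetalLibraryData: data)

let output = kernel.apply(extent: someOtherInput.extent,
                          roiCallback: { index, rect in
                              // argument 1 is the average image: the kernel always reads its full (1x1) extent
                              index == 1 ? average.extent : rect
                          },
                          arguments: [someOtherInput, average])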
You can also call .clampedToExtent() on the average CIImage before you pass it to another filter/kernel. This will cause Core Image to treat the average image as if it were infinitely large, containing the same value everywhere. Then it doesn't matter at which coordinate you sample the value. This might be useful if you want to use the average value in a custom CIColorKernel.
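A sketch of that CIColorKernel route (assumptions: a hypothetical myColorKernel function whose Metal signature takes two coreimage::sample_t parameters, compiled into the same metallib data as in the sketch above; inputImage and averageImage are the source image and the 1x1 average):
let clampedAverage = averageImage.clampedToExtent() // average value is now defined at every coordinate

let colorKernel = try! CIColorKernel(functionName: "myColorKernel", fromMetalLibraryData: data)
let output = colorKernel.apply(extent: inputImage.extent,
                               arguments: [inputImage, clampedAverage])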
Upvotes: 1
Reputation:
Something you can do that doesn't involve Metal is to use Core Image itself. Let's say you want a 640x640 image of the output from CIAreaAverage, which is called ciPixel:
let crop = CIFilter(name: "CICrop")
crop?.setValue(ciPixel, forKey: "inputImage") // ciPixel: the output of CIAreaAverage
crop?.setValue(CIVector(x: 0, y: 0, z: 640, w: 640), forKey: "inputRectangle")
let ciOutput = crop?.outputImage
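If ciPixel is the raw 1x1 output of CIAreaAverage, the crop alone only enlarges the canvas; to fill the whole 640x640 rectangle with the averaged color, the pixel can be clamped to an infinite extent first, as in this sketch:
let filledOutput = ciPixel
    .clampedToExtent()                                        // repeat the single average pixel everywhere
    .cropped(to: CGRect(x: 0, y: 0, width: 640, height: 640)) // then crop back down to 640x640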
Upvotes: 0