azamsharp

Reputation: 20066

Using CIEdgeWork Filters in iOS

I am using Core Image filters and trying to apply the CIEdgeWork filter. When the filter is applied, the image turns black. Am I initializing the CIFilter correctly?

 CIFilter *edgeWork = [CIFilter filterWithName:@"CIEdgeWork"
                                 keysAndValues:kCIInputImageKey, filterPreviewImage,
                                               @"inputRadius", [NSNumber numberWithFloat:3.0],
                                               nil];

Upvotes: 1

Views: 2617

Answers (2)

Viktor Goltvyanitsa

Reputation: 167

CIEdgeWork and CILineOverlay are now available in Core Image as of iOS 9.
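On iOS 9 or later you can therefore apply the built-in filter directly. A minimal sketch, assuming an existing UIImage named sourceImage (not part of the original post):

    #import <UIKit/UIKit.h>
    #import <CoreImage/CoreImage.h>

    // Minimal sketch, assuming iOS 9+ where CIEdgeWork is registered.
    CIImage *input = [CIImage imageWithCGImage:sourceImage.CGImage];

    CIFilter *edgeWork = [CIFilter filterWithName:@"CIEdgeWork"];
    [edgeWork setValue:input forKey:kCIInputImageKey];
    [edgeWork setValue:@3.0 forKey:kCIInputRadiusKey];

    CIImage *output = edgeWork.outputImage;

    // Render the result back into a UIImage.
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgResult = [context createCGImage:output fromRect:output.extent];
    UIImage *result = [UIImage imageWithCGImage:cgResult];
    CGImageRelease(cgResult);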

You can also use a Core Image Sobel sketch filter based on GPUImageSketchFilter: FWKSketchFilter.

Its kernel:

kernel vec4 sketch(sampler image, float strength)
{
    vec2 d = destCoord();

    // Coordinates of the eight neighbouring pixels.
    vec2 bottomLeftTextureCoordinate  = samplerTransform(image, d + vec2(-1.0, -1.0));
    vec2 topRightTextureCoordinate    = samplerTransform(image, d + vec2( 1.0,  1.0));
    vec2 topLeftTextureCoordinate     = samplerTransform(image, d + vec2(-1.0,  1.0));
    vec2 bottomRightTextureCoordinate = samplerTransform(image, d + vec2( 1.0, -1.0));

    vec2 leftTextureCoordinate   = samplerTransform(image, d + vec2(-1.0,  0.0));
    vec2 rightTextureCoordinate  = samplerTransform(image, d + vec2( 1.0,  0.0));
    vec2 bottomTextureCoordinate = samplerTransform(image, d + vec2( 0.0, -1.0));
    vec2 topTextureCoordinate    = samplerTransform(image, d + vec2( 0.0,  1.0));

    // Sample the red channel of each neighbour as its intensity.
    float bottomLeftIntensity  = sample(image, bottomLeftTextureCoordinate).r;
    float topRightIntensity    = sample(image, topRightTextureCoordinate).r;
    float topLeftIntensity     = sample(image, topLeftTextureCoordinate).r;
    float bottomRightIntensity = sample(image, bottomRightTextureCoordinate).r;

    float leftIntensity   = sample(image, leftTextureCoordinate).r;
    float rightIntensity  = sample(image, rightTextureCoordinate).r;
    float bottomIntensity = sample(image, bottomTextureCoordinate).r;
    float topIntensity    = sample(image, topTextureCoordinate).r;

    // Horizontal and vertical Sobel responses.
    float h = -topLeftIntensity - 2.0 * topIntensity - topRightIntensity + bottomLeftIntensity + 2.0 * bottomIntensity + bottomRightIntensity;
    float v = -bottomLeftIntensity - 2.0 * leftIntensity - topLeftIntensity + bottomRightIntensity + 2.0 * rightIntensity + topRightIntensity;

    // Invert the gradient magnitude so edges come out dark on a light background.
    float mag = 1.0 - (length(vec2(h, v)) * strength);

    return vec4(vec3(mag), 1.0);
}
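If you would rather run this kernel yourself instead of going through the FWKSketchFilter wrapper, a rough sketch is below. It assumes iOS 8+ (custom CIKernels), an input CIImage named inputImage, and an NSString named kernelSource containing the kernel code above; none of those names come from the original post.

    // Rough sketch: wrap the kernel string above in a CIKernel and apply it.
    CIKernel *sketchKernel = [CIKernel kernelWithString:kernelSource];

    CIImage *result = [sketchKernel applyWithExtent:inputImage.extent
                                        roiCallback:^CGRect(int index, CGRect destRect) {
                                            // The kernel reads a 3x3 neighbourhood,
                                            // so grow the region of interest by one pixel.
                                            return CGRectInset(destRect, -1.0, -1.0);
                                        }
                                          arguments:@[inputImage, @1.0]];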

Upvotes: 1

Brad Larson

Reputation: 170319

CIEdgeWork is not available in Core Image on iOS as of iOS 5.x, so it's no surprise that you're seeing a black image when trying to use it.

However, you can use the GPUImageSketchFilter or GPUImageThresholdEdgeDetection from my GPUImage framework to pull off this same effect. You can see the result of the first filter in this answer. The latter filter might be closer to the actual effect that Apple supplies via CIEdgeWork, given that they seem to binarize the resulting edge-detected image.
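As a quick sanity check, you can also ask Core Image at runtime whether "CIEdgeWork" is registered at all on the OS version you are running, before deciding to fall back. A small sketch:

    // Check whether CIEdgeWork is registered among the built-in filters
    // on the current iOS version.
    NSArray *builtInFilters = [CIFilter filterNamesInCategory:kCICategoryBuiltIn];
    if ([builtInFilters containsObject:@"CIEdgeWork"]) {
        NSLog(@"CIEdgeWork is available on this OS version.");
    } else {
        NSLog(@"CIEdgeWork is not available; fall back to another edge filter.");
    }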

Upvotes: 3
