Reputation: 6540
I'm trying to perform a flood fill on a UIImage that has opacity. I tried implementing a 4-way flood fill with Core Graphics (http://en.wikipedia.org/wiki/Flood_fill), but reading and writing pixel colours one at a time makes it too slow: it takes over a minute to fill a 300×100 px area. I need a function like:
-(UIImage*)floodFillOnImage:(UIImage*)image fromPoint:(CGPoint)start;
that is fast enough.
Does anybody have a working implementation of, or an idea for, a flood-fill algorithm in Objective-C that works with UIImages? I've seen paint bucket tools like this in some iOS drawing apps.
Upvotes: 2
Views: 2863
Reputation: 96333
I'm trying to perform floodfill on UIImage that has opacity.
Are you trying to fill the areas that are clear? If so, then there's a much simpler solution: Create a context, fill it with the desired color, draw the image on top, and then extract your completed image from the context.
If you're trying to fill the areas that are not clear, then there's a solution for that, too: Create a context, draw the image into it first, set the blend mode to source-in (kCGBlendModeSourceIn), fill with the color, and then extract the completed image.
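A minimal sketch of both approaches using UIKit's image-context API (the helper names are mine, not a standard API):

```objc
#import <UIKit/UIKit.h>

// Sketch: fill the *clear* areas — paint the colour first, image on top.
static UIImage *FillClearAreas(UIImage *image, UIColor *color) {
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [color setFill];
    UIRectFill((CGRect){CGPointZero, image.size});
    [image drawAtPoint:CGPointZero];
    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}

// Sketch: fill the *opaque* areas — draw the image first, then fill with
// the source-in blend mode, which keeps the colour only where the image
// already has alpha.
static UIImage *FillOpaqueAreas(UIImage *image, UIColor *color) {
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawAtPoint:CGPointZero];
    [color setFill];
    UIRectFillUsingBlendMode((CGRect){CGPointZero, image.size},
                             kCGBlendModeSourceIn);
    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}
```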
Upvotes: 1
Reputation: 53551
If it takes over a minute to flood-fill a 300×100 image, your implementation of getting and setting pixel values is probably extremely inefficient. If you draw your image into a bitmap context and then access (and modify) the pixel data directly, it should be a lot faster.
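To illustrate how much simpler (and faster) the fill becomes once you have direct buffer access, here is a plain-C 4-way flood fill operating on a raw RGBA8888 buffer, such as the one backing a bitmap context. The function name and the explicit-stack design are mine, not from the question:

```c
#include <stdint.h>
#include <stdlib.h>

/* Hypothetical helper: 4-way flood fill directly on a raw RGBA8888
 * buffer (4 bytes per pixel, treated as one 32-bit value each). */
static void floodFillBuffer(uint8_t *pixels, int w, int h,
                            int startX, int startY, uint32_t fillColor) {
    if (startX < 0 || startX >= w || startY < 0 || startY >= h) return;
    uint32_t *px = (uint32_t *)pixels;
    uint32_t target = px[startY * w + startX];  /* colour being replaced */
    if (target == fillColor) return;

    /* Explicit stack instead of recursion: recursing per pixel on a large
     * region would overflow the call stack. Each pixel is pushed at most
     * once per neighbour, hence the 4*w*h+1 capacity bound. */
    int *stack = malloc(sizeof(int) * (4 * w * h + 1));
    int top = 0;
    stack[top++] = startY * w + startX;

    while (top > 0) {
        int idx = stack[--top];
        if (px[idx] != target) continue;  /* already filled, or a border */
        px[idx] = fillColor;
        int x = idx % w, y = idx / w;
        if (x > 0)     stack[top++] = idx - 1;
        if (x < w - 1) stack[top++] = idx + 1;
        if (y > 0)     stack[top++] = idx - w;
        if (y < h - 1) stack[top++] = idx + w;
    }
    free(stack);
}
```

Because every read and write is a plain array access, this easily handles a 300×100 region in well under a second.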
The basic approach would be to allocate a memory buffer (a C array of 4 (RGBA) × width × height bytes), pass that to CGBitmapContextCreate, draw your source image into the context (e.g. using CGContextDrawImage), and then operate directly on the buffer (the pixel values). When you're done, you can create an image from your context using CGBitmapContextCreateImage.
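A sketch of that pipeline, assuming an RGBA8888 layout with premultiplied alpha (the channel order and colour space are assumptions you may need to adjust):

```objc
size_t w = CGImageGetWidth(image.CGImage);
size_t h = CGImageGetHeight(image.CGImage);
uint8_t *buf = calloc(w * h * 4, 1);            // 4 bytes (RGBA) per pixel

CGColorSpaceRef cs = CGColorSpaceCreateDeviceRGB();
CGContextRef ctx = CGBitmapContextCreate(buf, w, h, 8, w * 4, cs,
                                         kCGImageAlphaPremultipliedLast);
CGContextDrawImage(ctx, CGRectMake(0, 0, w, h), image.CGImage);

// ... operate directly on the pixel at buf[4 * (y * w + x)] here ...

CGImageRef filledCG = CGBitmapContextCreateImage(ctx);
UIImage *result = [UIImage imageWithCGImage:filledCG];
CGImageRelease(filledCG);
CGContextRelease(ctx);
CGColorSpaceRelease(cs);
free(buf);
```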
Upvotes: 6