I'm writing a simple program that finds the brightest pixel in an image; eventually it will be part of something that finds the brightest pixel in a video frame. With small images it works fine: on my 8x8 test image, which is all black with one white pixel, it finds the white pixel seemingly in real time. But when I upped it to a 1000x1000 image, it takes several seconds to find it. My goal is to locate the pixel 15+ times a second on something larger than 1000x1000. Is this even possible? Here's the code I'm using.
// These are at the beginning of the class
static NSBitmapImageRep *imageRepStatic;
static float brightestBrightness;
static CGPoint brightest;

// This is in my function for getting the pixel.
// pixelsHigh/pixelsWide are the actual pixel counts; size is in points
// and can differ from the pixel dimensions on Retina displays.
for (int y = 0; y < imageRepStatic.pixelsHigh; y++) {
    for (int x = 0; x < imageRepStatic.pixelsWide; x++) {
        NSColor *color = [imageRepStatic colorAtX:x y:y];
        // Read the components directly instead of parsing the
        // string returned by -description, which is fragile.
        float red = [color redComponent];
        float green = [color greenComponent];
        float blue = [color blueComponent];
        float brightness = (red + green + blue) / 3;
        if (brightness >= brightestBrightness) {
            brightestBrightness = brightness;
            brightest = CGPointMake(x, y);
        }
    }
}
NSLog(@"The brightest pixel is at (%f, %f) and has a brightness of %f", brightest.x, brightest.y, brightestBrightness);
frame++;
NSLog(@"%i", frame);