Reputation: 2600
I'm trying to fix the white balance of a picture on iOS. In my app, people can take a picture and get two things: an OCR result and an "improved" version of the image.
For the OCR, I preprocess the image with GPUImage (using GPUImageAdaptiveThresholdFilter), but that output contains only black and white pixels. For my "improved" version, I want:
- the right color balance (meaning my white is really white and not yellow when I take the picture indoors)
- good contrast.
I tried GPUImageContrastFilter and GPUImageWhiteBalanceFilter. GPUImageWhiteBalanceFilter works well, but it takes parameters (like temperature) whose right values really depend on the input image.
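Here is roughly what my current pipeline looks like (a simplified sketch; the hard-coded filter values are just guesses that only suit some lighting conditions, which is exactly the problem):

```objc
#import "GPUImage.h"

- (UIImage *)improvedVersionOfImage:(UIImage *)photo {
    GPUImagePicture *source = [[GPUImagePicture alloc] initWithImage:photo];

    GPUImageWhiteBalanceFilter *whiteBalance = [[GPUImageWhiteBalanceFilter alloc] init];
    whiteBalance.temperature = 6200.0; // hard-coded guess, only right for some lighting

    GPUImageContrastFilter *contrast = [[GPUImageContrastFilter alloc] init];
    contrast.contrast = 1.5;           // another hand-tuned value

    [source addTarget:whiteBalance];
    [whiteBalance addTarget:contrast];
    [contrast useNextFrameForImageCapture];
    [source processImage];
    return [contrast imageFromCurrentFramebuffer];
}
```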
So is there a way to "calculate" these parameters, or are there Objective-C algorithms to fix the white balance? Either ready to use, or simple to implement with GPUImage?
Thanks!
Upvotes: 0
Views: 704
Reputation: 9143
You can use CoreImage to auto-enhance images. It will do the analysis for you.
You ask the image for a set of CIFilter instances, already configured with the necessary adjustments, and simply apply them to your image.
See Apple's documentation for CIImage's autoAdjustmentFiltersWithOptions: for more info.
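A minimal sketch of that flow, assuming a UIImage input and skipping red-eye correction since you only care about color and contrast:

```objc
#import <UIKit/UIKit.h>
#import <CoreImage/CoreImage.h>

- (UIImage *)autoEnhancedVersionOfImage:(UIImage *)photo {
    CIImage *ciImage = [[CIImage alloc] initWithImage:photo];

    // Core Image analyses the picture and returns the filters it
    // recommends, with their parameters already set for this image.
    NSArray *adjustments = [ciImage autoAdjustmentFiltersWithOptions:@{ kCIImageAutoAdjustRedEye : @NO }];
    for (CIFilter *filter in adjustments) {
        [filter setValue:ciImage forKey:kCIInputImageKey];
        ciImage = filter.outputImage;
    }

    // Render the adjusted CIImage back into a UIImage.
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:ciImage fromRect:ciImage.extent];
    UIImage *result = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return result;
}
```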
Upvotes: 1
Reputation: 89172
It may take time, but you can use your OCR function as a scoring function for a hill-climbing (or other) optimization algorithm.
If OCR is too slow, you may have to pick something else to score with. Perhaps there's a way to compare your output images with known good ones and get a diff.
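A rough sketch of that idea, hill-climbing over GPUImageWhiteBalanceFilter's temperature parameter. OCRScoreForImage is a hypothetical hook around whatever OCR call you already have (e.g. it could return the number of characters recognized or an average confidence):

```objc
#import "GPUImage.h"

// Hypothetical scoring hook: wrap your existing OCR so it returns a
// number that is higher when the recognition result is better.
extern CGFloat OCRScoreForImage(UIImage *image);

// Render one candidate by applying a white balance filter with the
// given temperature (GPUImage's neutral default is 5000).
static UIImage *ImageWithTemperature(UIImage *input, CGFloat temperature) {
    GPUImagePicture *source = [[GPUImagePicture alloc] initWithImage:input];
    GPUImageWhiteBalanceFilter *whiteBalance = [[GPUImageWhiteBalanceFilter alloc] init];
    whiteBalance.temperature = temperature;
    [source addTarget:whiteBalance];
    [whiteBalance useNextFrameForImageCapture];
    [source processImage];
    return [whiteBalance imageFromCurrentFramebuffer];
}

// Simple hill climbing: try the two neighbours of the current best
// temperature; if neither scores higher, halve the step and retry.
static CGFloat BestTemperatureForImage(UIImage *input) {
    CGFloat best = 5000.0;
    CGFloat bestScore = OCRScoreForImage(ImageWithTemperature(input, best));
    CGFloat step = 1000.0;

    while (step >= 100.0) {
        BOOL improved = NO;
        CGFloat neighbours[2] = { best - step, best + step };
        for (int i = 0; i < 2; i++) {
            CGFloat score = OCRScoreForImage(ImageWithTemperature(input, neighbours[i]));
            if (score > bestScore) {
                bestScore = score;
                best = neighbours[i];
                improved = YES;
            }
        }
        if (!improved) {
            step /= 2.0;
        }
    }
    return best;
}
```

The same loop generalizes to the contrast parameter (or to both at once) if you can afford the extra OCR passes per candidate.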
Upvotes: 0