Reputation: 75
I used a ColorChecker 24 image to calculate a colour correction matrix with the colour-science package in Python. First, I extract the card from the image with the colour_checker_detection module (detect_colour_checkers_segmentation). I then use the 24 detected colour swatches together with colour.CCS_COLOURCHECKERS['ColorChecker24 - After November 2014'] as the reference to compute the colour-corrected checker card image. I apply cctf_decoding before and cctf_encoding after performing the colour correction.

With Cheung 2004 and 3 terms there is no apparent clipping, but it becomes a serious problem when applying Cheung 2004 with 20 terms. The same happens with Finlayson 2015, though not as badly as with Cheung 2004 and 20 terms.

[![most of the swatches display a certain amount of clipping][1]][1]

The original image was captured without saturation and was white balanced.
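For reference, here is a minimal sketch of the pipeline (not my exact code; the file names are placeholders, and the calls follow the colour-science 0.4.x / colour-checker-detection documentation, so signatures may differ slightly in other versions):

```python
import numpy as np
import colour
from colour_checker_detection import detect_colour_checkers_segmentation

# Placeholder file name; the image is decoded to linear light before detection.
image = colour.cctf_decoding(colour.read_image("capture.png"))

# Reference swatches of the post-November-2014 chart, converted to linear sRGB.
reference = colour.CCS_COLOURCHECKERS["ColorChecker24 - After November 2014"]
reference_swatches = colour.XYZ_to_RGB(
    colour.xyY_to_XYZ(list(reference.data.values())),
    "sRGB",
    reference.illuminant,
)

for swatches in detect_colour_checkers_segmentation(image):
    # Correction is done in linear light; the CCTF is only re-applied at the end.
    corrected = colour.colour_correction(
        image, swatches, reference_swatches, method="Cheung 2004", terms=3
    )
    # Values pushed outside [0, 1] by the correction are what shows up as clipping.
    colour.write_image(np.clip(colour.cctf_encoding(corrected), 0, 1), "corrected.png")
```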
My question is: how can I minimize this clipping in the colour-corrected images? Did I miss a step? Did I do something wrong, or were the images simply not ideal to work with?

  [1]: https://i.sstatic.net/Z726s.png
Upvotes: 0
Views: 178
Reputation: 4090
It is hard to confirm without seeing the original data, but as you use higher-order functions you will also fit the noise in your image more closely.
The image you posted seems to exhibit quite a bit of noise; I would recommend either denoising it or averaging multiple images, i.e. stacking, as sketched below.
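A minimal stacking sketch, assuming a series of repeated captures of the same, static chart (file names are placeholders):

```python
import numpy as np
import colour

# Placeholder file names: repeated captures of the same, static chart.
paths = [f"capture_{i:02d}.png" for i in range(8)]

# Averaging in linear light keeps the noise reduction unbiased by the CCTF.
stack = np.mean([colour.cctf_decoding(colour.read_image(p)) for p in paths], axis=0)
```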
Another important thing to consider is that higher-order functions tend to behave very well on the training dataset but can explode outside of it, i.e. high-order polynomials do not extrapolate well.
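As a quick sanity check, you can compare how much each method pushes values outside [0, 1]. A sketch assuming `image`, `swatches` and `reference_swatches` are the linear image, detected swatch colours and reference swatch colours from your existing pipeline:

```python
import numpy as np
import colour

# Method / keyword pairs matching the ones compared in the question.
methods = {
    "Cheung 2004, 3 terms": dict(method="Cheung 2004", terms=3),
    "Cheung 2004, 20 terms": dict(method="Cheung 2004", terms=20),
    "Finlayson 2015, degree 1": dict(method="Finlayson 2015", degree=1),
}

for name, kwargs in methods.items():
    corrected = colour.colour_correction(image, swatches, reference_swatches, **kwargs)
    outside = np.mean((corrected < 0) | (corrected > 1)) * 100
    print(f"{name}: {outside:.2f}% of channel values outside [0, 1]")
```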
Upvotes: 1