Reputation: 123
Let's say we have an image with n pixels, each of the format (r, g, b).
We now choose X colors, also of the format (r, g, b), to generate a second image of the same size. Each of these X colors will represent n/X freely chosen pixels of the new image.
Then we compute the deviation between the original image and the newly generated image by going over each pixel pair from 1..n in both images and summing up their color differences sqrt((R1-R2)^2 + (G1-G2)^2 + (B1-B2)^2).
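To make the error measure concrete, here is a small Python sketch of that computation; the function name, the use of NumPy, and the assumed (height, width, 3) array layout are my own choices for illustration, not part of the original question:

```python
import numpy as np

def total_color_error(original, quantized):
    """Sum of per-pixel Euclidean color differences between two RGB images.

    Both inputs are assumed to be NumPy arrays of shape (height, width, 3)
    holding (r, g, b) values.
    """
    diff = original.astype(np.float64) - quantized.astype(np.float64)
    # Per-pixel distance: sqrt((R1-R2)^2 + (G1-G2)^2 + (B1-B2)^2)
    per_pixel = np.sqrt((diff ** 2).sum(axis=-1))
    return per_pixel.sum()
```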
Now let's assume we have distributed our X colors so that this sum of deviations is minimal.
My question is: how do we find the perfect X colors, so that after an optimal distribution the total deviation is smaller than for any other set of X colors?
If any information is missing or unclear (due to my bad English), please let me know.
Thank you!
PS: For better understanding, I added an example with X = 4. The distribution of colors is nearly optimal, but I'm sure I didn't choose the perfect 4 colors to represent the original image.
Unfortunately, I don't have the reputation to post an image, so I added a link to it instead: Before-After example for X=4
Upvotes: 0
Views: 160
Reputation: 15388
The problem is called color quantization: given n input colors and at most m < n output colors, find the assignment of color values such that the total color error is minimal.
Optimal quantization is NP-hard, roughly meaning that there is no known algorithm that solves the general case without effectively trying all possible combinations.
There is, however, a variety of approximation algorithms, most of which are based on clustering, since the problem is essentially equivalent to positioning m points in a field of n points such that the sum of the distances between the (few) m points and all the (many) n points is minimal.
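As an illustration of the clustering approach, here is a minimal sketch using k-means via scikit-learn; the function name and parameters are assumptions for the example, and k-means only approximates the optimal palette rather than guaranteeing it:

```python
import numpy as np
from sklearn.cluster import KMeans

def quantize(image, m, seed=0):
    """Approximate color quantization of an (h, w, 3) RGB image to m colors."""
    h, w, _ = image.shape
    pixels = image.reshape(-1, 3).astype(np.float64)

    # Cluster all pixels into m groups; the cluster centers become the palette.
    km = KMeans(n_clusters=m, random_state=seed, n_init=10).fit(pixels)
    palette = km.cluster_centers_

    # Replace every pixel by the center of its cluster.
    quantized = palette[km.labels_].reshape(h, w, 3)
    return quantized.astype(image.dtype), palette
```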
Note also that the Euclidean distance in RGB space is by no means a good measure of perceptual (colorimetric) color difference.
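If a more perceptually meaningful error measure is wanted, one option (my suggestion, not stated in the answer) is to convert to CIELAB and measure differences there, for example with scikit-image:

```python
from skimage.color import rgb2lab, deltaE_cie76
from skimage.util import img_as_float

def lab_error(original, quantized):
    """Total CIE76 color difference between two RGB images, measured in CIELAB."""
    lab1 = rgb2lab(img_as_float(original))
    lab2 = rgb2lab(img_as_float(quantized))
    return deltaE_cie76(lab1, lab2).sum()
```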
Upvotes: 2