Hiago Prata

Reputation: 117

How do I normalize colors to obtain a single color?

I have to build an algorithm that takes an RGB image and returns the image turned into a wood-like mosaic. For this, I was given some wood tablet samples, as seen in the image below:

[image: wood tablet samples]

I'd like to know how I can normalize the colors of each tablet, resulting in a single color, so I can build a map of reference colors to convert the input image colors to.

I've searched for how to achieve this, but all I found was a Wikipedia article, and I couldn't understand much of it.

Thanks in advance for all help you might provide me.

PS: I'm considering using Python to develop this. So if you come up with something done using this language, I'd really appreciate it.

Upvotes: 1

Views: 1527

Answers (3)

Mark Ransom

Reputation: 308528

The way to get the average color is to simply take the average of the RGB values.

To get a more accurate average you should do this with linear color values. Usually RGB uses a gamma corrected value, but you can easily undo it then redo it once you have the average. Here's how you'd do it with Python's PIL using a gamma of 2.2:

def average_color(sample):
    """Return the average color of a PIL Image, averaged in linear light."""
    pix = sample.load()
    totals = [0.0, 0.0, 0.0]
    for y in range(sample.size[1]):
        for x in range(sample.size[0]):
            color = pix[x, y]
            for c in range(3):
                totals[c] += color[c] ** 2.2  # undo gamma: convert to linear
    count = sample.size[0] * sample.size[1]
    # average in linear space, then reapply the gamma
    color = tuple(int(round((totals[c] / count) ** (1 / 2.2))) for c in range(3))
    return color

For the sample in the upper left of your examples, the result is (144, 82, 66). Here's a visual of all of them:

[images: each sample shown next to its computed average color]
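To see why averaging in linear light matters, here's a minimal stand-alone sketch with two synthetic gray pixel values (0 and 255, chosen purely for illustration), comparing the naive average against the gamma-2.2 linear-light average used above:

```python
# Minimal demo: averaging two gray pixels naively vs. in linear light.
# The pixel values are synthetic, chosen to make the difference obvious.
pixels = [0, 255]

naive = round(sum(pixels) / len(pixels))  # 128

linear = [p ** 2.2 for p in pixels]                               # undo gamma
gamma_correct = round((sum(linear) / len(pixels)) ** (1 / 2.2))   # redo gamma

print(naive, gamma_correct)  # 128 186
```

The linear-light average of pure black and pure white comes out much brighter (186) than the naive byte average (128), which matches how the mixture actually appears.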

Upvotes: 2

Cris Luengo

Reputation: 60809

One trivial way to normalize the colors is to simply force the mean and standard deviation of RGB values in all images to be the same.

Here is an example with the two panels at the top of the left column in the example image. I'm using MATLAB with DIPimage 3.0, because that is what I know, but this is trivial enough to implement in Python with NumPy, or any other desired language/library:

img = readim('https://i.sstatic.net/HK6VY.png')
tab1 = dipcrop; % Interactive cropping of a tile from the displayed image
tab2 = dipcrop;

m1 = mean(tab1);
s1 = std(tab1);
m2 = mean(tab2);
s2 = std(tab2);
tab2b = (tab2 - m2) ./ s2 .* s1 + m1;

[images: the three tiles tab1, tab2 and tab2b]

What the code does to the image tab2 is, on a per-channel basis, subtract the mean and divide by the standard deviation. It then multiplies each channel by the standard deviation of the corresponding channel of the template image, and adds the mean of that channel.
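Since the question asks for Python, here is a sketch of the same per-channel normalization in NumPy. The arrays tab1 and tab2 are stand-ins filled with random values, because the real tiles come from interactive cropping in the MATLAB example:

```python
import numpy as np

# Stand-in tiles of shape (height, width, 3); real tiles would be crops
# of the sample image.
rng = np.random.default_rng(0)
tab1 = rng.uniform(100, 150, size=(8, 8, 3))  # "template" tile
tab2 = rng.uniform(40, 90, size=(8, 8, 3))    # tile to normalize

# Per-channel mean and standard deviation
m1, s1 = tab1.mean(axis=(0, 1)), tab1.std(axis=(0, 1))
m2, s2 = tab2.mean(axis=(0, 1)), tab2.std(axis=(0, 1))

# Force tab2's per-channel statistics to match tab1's
tab2b = (tab2 - m2) / s2 * s1 + m1

print(np.allclose(tab2b.mean(axis=(0, 1)), m1))  # True
print(np.allclose(tab2b.std(axis=(0, 1)), s1))   # True
```

After the transformation, tab2b has exactly the mean and standard deviation of the template tile in every channel.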

Upvotes: 0

Vaibhav Mehrotra

Reputation: 426

To make one color represent a tile, a simple option would be to find the mean color of a random sample of pixels in a specific tile. You can choose an appropriate sample size as a trade-off between speed and accuracy.
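As a rough sketch of that sampling idea, assuming the tile has already been cropped into a NumPy array (the tile and sample size here are synthetic placeholders):

```python
import numpy as np

# Hypothetical cropped tile; a real one would come from the sample image.
rng = np.random.default_rng(42)
tile = rng.integers(60, 160, size=(100, 100, 3))

# Sample size is the speed/accuracy trade-off knob
n_samples = 500
ys = rng.integers(0, tile.shape[0], size=n_samples)
xs = rng.integers(0, tile.shape[1], size=n_samples)

# Mean color over the randomly sampled pixels
mean_color = tile[ys, xs].mean(axis=0)

print(mean_color.shape)  # (3,)
```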

For your specific use case, I'd recommend further dividing each tile, say into 3 columns (because of the top-to-bottom grain of most wood panels). Find the mean color of each column and discard any that exceeds a certain variance threshold. This helps ensure that tiles such as the rightmost one in the 4th row don't get mapped to the darker shade.

An alternative approach would be to convert both your input image and these wood tiles to grayscale and carry out your processing there. The OpenCV library has simple functions for RGB-to-grayscale conversion (e.g. cv2.cvtColor).
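For illustration, here's a small sketch of that conversion using the Rec. 601 luma weights, which is what OpenCV's cv2.cvtColor applies for RGB-to-gray; it's written in plain NumPy so it runs without OpenCV installed:

```python
import numpy as np

# Tiny synthetic RGB image: one row of pure red, green, and blue pixels
rgb = np.array([[[255, 0, 0], [0, 255, 0], [0, 0, 255]]], dtype=np.float64)

# Rec. 601 luma weights, as used by cv2.cvtColor(img, cv2.COLOR_RGB2GRAY)
weights = np.array([0.299, 0.587, 0.114])

gray = rgb @ weights  # weighted sum over the last (channel) axis

print(gray)  # ≈ [[76.2, 149.7, 29.1]]
```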

Upvotes: 0
