Wen

Reputation: 103

Python PIL.Image.convert not replacing a color with the closest palette entry.

This is a sort-of follow-up to the question: Convert image to specific palette using PIL without dithering

I, too, want to create a script that can convert an image to a specific set of colors without dithering.

I have implemented the work-around "custom quantization" function given as the answer to that question. Most of the script works well, except for one big problem.

The light green color RGB(130, 190, 40) is replaced by a light brown color RGB(166, 141, 95) (see the light green at the top left of the mane).

from PIL import Image

def customConvert(silf, palette, dither=False):
    ''' Convert an RGB or L mode image to use a given P image's palette.
        PIL.Image.quantize() forces dither = 1. 
        This custom quantize function will force it to 0.
        https://stackoverflow.com/questions/29433243/convert-image-to-specific-palette-using-pil-without-dithering
    '''

    silf.load()

    # use palette from reference image made below
    palette.load()
    im = silf.im.convert("P", 0, palette.im)
    # the 0 above means turn OFF dithering making solid colors
    return silf._new(im)

palette = [ 
    0,0,0,
    0,0,255,
    15,29,15,
    26,141,52,
    41,41,41,
    65,105,225,
    85,11,18,
    128,0,128,
    135,206,236,
    144,238,144,
    159,30,81,
    165,42,42,
    166,141,95,
    169,169,169,
    173,216,230,
    211,211,211,
    230,208,122,
    245,245,220,
    247,214,193,
    255,0,0,
    255,165,0,
    255,192,203,
    255,255,0,
    255,255,255
    ] + [0,] * 232 * 3


# a palette image to use for quant
paletteImage = Image.new('P', (1, 1), 0)
paletteImage.putpalette(palette)


# open the source image
imageOrginal = Image.open('lion.png').convert('RGB')

# convert it using our palette image
imageCustomConvert = customConvert(imageOrginal, paletteImage, dither=False).convert('RGB')

CIE76 Delta-E:

Currently: RGB(130,190,40) --> RGB(166, 141, 95) = 57.5522

Expected: RGB(130,190,40) --> RGB(144,238,144) = 31.5623
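
For reference, a small sketch (not part of the original script, assuming scikit-image for the RGB-to-Lab conversion) that computes those Delta-E figures; the exact values may vary slightly with the illuminant/observer settings:

import numpy as np
from skimage import color

def cie76(rgb1, rgb2):
    # Convert two 8-bit RGB triples to Lab via 1x1 "images" (scikit-image, D65 by default)
    lab1 = color.rgb2lab(np.uint8([[rgb1]]))[0, 0]
    lab2 = color.rgb2lab(np.uint8([[rgb2]]))[0, 0]
    # CIE76 Delta-E is simply the Euclidean distance in Lab space
    return np.linalg.norm(lab1 - lab2)

print(cie76((130, 190, 40), (166, 141, 95)))   # the brown the script currently picks
print(cie76((130, 190, 40), (144, 238, 144)))  # the light green I expected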


Can someone explain whether I wrote the code incorrectly, or suggest how to get it to work?

[Original image] [Custom convert result]

Upvotes: 3

Views: 4455

Answers (2)

Mark Setchell

Reputation: 207455

I had a try at calculating the CIE76 Delta-E value for each pixel to get the nearest colour. Python is not my best language, so you may want to ask another question to get the code optimised if it works how you expect.

I basically convert the input image and the palette into Lab colourspace, then compute the CIE76 Delta-E value squared from each pixel to each of the palette entries and take the nearest one.

#!/usr/bin/env python3

import numpy as np
from PIL import Image
from skimage import color

def CIE76DeltaE2(Lab1,Lab2):
    """Returns the square of the CIE76 Delta-E colour distance between 2 lab colours"""
    return (Lab2[0]-Lab1[0])*(Lab2[0]-Lab1[0]) + (Lab2[1]-Lab1[1])*(Lab2[1]-Lab1[1]) + (Lab2[2]-Lab1[2])*(Lab2[2]-Lab1[2])

def NearestPaletteIndex(Lab,palLab):
    """Return index of entry in palette that is nearest the given colour"""
    NearestIndex = 0
    NearestDist   = CIE76DeltaE2(Lab,palLab[0,0])
    for e in range(1,palLab.shape[0]):
        dist = CIE76DeltaE2(Lab,palLab[e,0])
        if dist < NearestDist:
            NearestDist = dist
            NearestIndex = e
    return NearestIndex

palette = [ 
    0,0,0,
    0,0,255,
    15,29,15,
    26,141,52,
    41,41,41,
    65,105,225,
    85,11,18,
    128,0,128,
    135,206,236,
    144,238,144,
    159,30,81,
    165,42,42,
    166,141,95,
    169,169,169,
    173,216,230,
    211,211,211,
    230,208,122,
    245,245,220,
    247,214,193,
    255,0,0,
    255,165,0,
    255,192,203,
    255,255,0,
    255,255,255
    ] + [0,] * 232 * 3


# Load the source image as numpy array and convert to Lab colorspace
imnp = np.array(Image.open('lion.png').convert('RGB'))
imLab = color.rgb2lab(imnp) 
h,w = imLab.shape[:2]

# Load palette as numpy array, truncate unused palette entries, and convert to Lab colourspace
palnp = np.array(palette,dtype=np.uint8).reshape(256,1,3)[:24,:]
palLab = color.rgb2lab(palnp)

# Make numpy array for output image
resnp = np.empty((h,w), dtype=np.uint8)

# Iterate over pixels, replacing each with the nearest palette entry
for y in range(0, h):
    for x in range(0, w):
        resnp[y, x] = NearestPaletteIndex(imLab[y,x], palLab)

# Create output image from indices, whack a palette in and save
resim = Image.fromarray(resnp, mode='P')
resim.putpalette(palette)
resim.save('result.png')

I get this:

[result image]


It seems slightly faster and more succinct to use scipy.spatial.distance's cdist() function:

#!/usr/bin/env python3

import numpy as np
from PIL import Image
from skimage import color
from scipy.spatial.distance import cdist

palette = [ 
    0,0,0,
    0,0,255,
    15,29,15,
    26,141,52,
    41,41,41,
    65,105,225,
    85,11,18,
    128,0,128,
    135,206,236,
    144,238,144,
    159,30,81,
    165,42,42,
    166,141,95,
    169,169,169,
    173,216,230,
    211,211,211,
    230,208,122,
    245,245,220,
    247,214,193,
    255,0,0,
    255,165,0,
    255,192,203,
    255,255,0,
    255,255,255
    ] + [0,] * 232 * 3


# Load the source image as numpy array and convert to Lab colorspace
imnp  = np.array(Image.open('lion.png').convert('RGB'))
h,w   = imnp.shape[:2]
imLab = color.rgb2lab(imnp).reshape((h*w,3))

# Load palette as numpy array, truncate unused palette entries, and convert to Lab colourspace
palnp = np.array(palette,dtype=np.uint8).reshape(256,1,3)[:24,:]
palLab = color.rgb2lab(palnp).reshape(24,3)

# Make numpy array for output image
resnp = np.empty(h*w, dtype=np.uint8)

# Iterate over pixels, replacing each with the nearest palette entry
# (plain Euclidean distance in Lab is the CIE76 Delta-E metric)
for x, L in enumerate(imLab):
    resnp[x] = cdist(palLab, L.reshape(1,3), metric='euclidean').argmin()

# Create output image from indices, whack the palette in and save
resim = Image.fromarray(resnp.reshape(h,w), mode='P')
resim.putpalette(palette)
resim.save('result.png')
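
If the per-pixel loop is still a bottleneck, the same idea can be fully vectorised by handing cdist() the whole image at once and taking the argmin along the palette axis. A sketch, reusing the imLab, palLab, h, w and palette variables from the script above:

# Distances from every pixel to every palette entry in one call: shape (h*w, 24)
# Plain Euclidean distance in Lab space is the CIE76 Delta-E metric
dists = cdist(imLab, palLab, metric='euclidean')
resnp = dists.argmin(axis=1).astype(np.uint8)

# Create output image from indices, whack the palette in and save
resim = Image.fromarray(resnp.reshape(h, w), mode='P')
resim.putpalette(palette)
resim.save('result.png')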

Upvotes: 1

Mark Setchell

Reputation: 207455

ImageMagick can do this much faster, if speed is the issue. It is installed on most Linux distros and is available for macOS and Windows.

Basically you would create a 24x1 image, called "map.png", with one pixel of each colour in your palette, and tell ImageMagick to remap your lion image to that colormap in the Lab colourspace without dithering. So, the command in Terminal/Command Prompt would be:

magick lion.png +dither -quantize Lab -remap map.png result.png

That runs in under 0.3 seconds. If you wanted to do that from Python, you could shell out like this:

#!/usr/bin/env python3

import subprocess
import numpy as np
from PIL import Image

palette = [ 
    0,0,0,
    0,0,255,
    15,29,15,
    26,141,52,
    41,41,41,
    65,105,225,
    85,11,18,
    128,0,128,
    135,206,236,
    144,238,144,
    159,30,81,
    165,42,42,
    166,141,95,
    169,169,169,
    173,216,230,
    211,211,211,
    230,208,122,
    245,245,220,
    247,214,193,
    255,0,0,
    255,165,0,
    255,192,203,
    255,255,0,
    255,255,255
    ] + [0,] * 232 * 3


# Write "map.png" that is a 24x1 pixel image with one pixel for each colour
entries = 24
resnp   = np.arange(entries,dtype=np.uint8).reshape(1,entries)
resim = Image.fromarray(resnp, mode='P')
resim.putpalette(palette)
resim.save('map.png')

# Use Imagemagick to remap to palette saved above in 'map.png'
# magick lion.png +dither -quantize Lab -remap map.png result.png
subprocess.run(['magick', 'lion.png', '+dither', '-quantize', 'Lab', '-remap', 'map.png', 'result.png'])
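
One caveat, not part of the original answer: ImageMagick 7 provides the single magick binary used above, but many Linux distros still ship ImageMagick 6, where the equivalent command is convert. A small fallback sketch, assuming one of the two binaries is on the PATH:

import shutil

# Prefer the ImageMagick 7 "magick" binary, fall back to v6's "convert" if absent
im_cmd = 'magick' if shutil.which('magick') else 'convert'
subprocess.run([im_cmd, 'lion.png', '+dither', '-quantize', 'Lab', '-remap', 'map.png', 'result.png'], check=True)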

[result image]

Upvotes: 3
