HyunYoung Go

Reputation: 103

grayscale image different in cv2.imshow() and matplotlib.pyplot.show()

import cv2
import sys
import matplotlib.pyplot as plt

imgfile = "image.png"  # placeholder path; substitute your own image
imgGray = cv2.imread(imgfile, cv2.IMREAD_GRAYSCALE)

plt.imshow(imgGray, cmap='gray')
plt.show()

cv2.imshow("", imgGray)
cv2.waitKey(0)
cv2.destroyAllWindows()

sys.exit()

plt.show() result

[screenshot: plt.show() output]

cv2.imshow() result

[screenshot: cv2.imshow() output]

I thought both of them would be the same. But as you can see, the two pictures have different grayscale rendering. plt.show() seems darker than cv2.imshow().

How do I make the grayscale in plt.show() match cv2.imshow()?

Python : 3.9.6

opencv-python : 4.5.3.56

matplotlib : 3.4.3

Upvotes: 4

Views: 2937

Answers (1)

Christoph Rackwitz

Reputation: 15354

This is matplotlib's default behavior: when you don't pass vmin and vmax, it finds the minimum and maximum of your picture, maps those to black and white, and linearly scales everything in between.

This is useful for arbitrary data that may have integer or floating point types, and value ranges of 0.0 to 1.0, or 0 .. 255, or anything else.
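You can see this auto-scaling directly with matplotlib.colors.Normalize, which is what imshow uses internally to map data values to the 0..1 range the colormap expects (the array values here are just illustrative):

```python
import numpy as np
from matplotlib.colors import Normalize

img = np.array([[60, 110], [160, 210]], dtype=np.uint8)

# No vmin/vmax given: limits are inferred from the data,
# so 60 maps to 0.0 (black) and 210 maps to 1.0 (white).
auto = Normalize()(img)

# Explicit limits: 60 maps to 60/255, as cv2.imshow would show it.
fixed = Normalize(vmin=0, vmax=255)(img)
```

With the auto-scaled version, the darkest pixel in the image is always rendered as pure black and the brightest as pure white, regardless of the actual values, which is why the two windows look different.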

You can set those limits yourself with vmin and vmax arguments:

plt.imshow(imgGray, cmap='gray', vmin=0, vmax=255)  # if your data is uint8 (0 .. 255)

OpenCV does no such auto-scaling. It uses fixed rules: if the image is floating point, 0.0 is black and 1.0 is white; if it's uint8, the range is 0 .. 255.

To get such auto-ranging in OpenCV, you'll have to scale the data before displaying:

normalized = cv.normalize(
    data, None, alpha=0.0, beta=1.0,
    norm_type=cv.NORM_MINMAX, dtype=cv.CV_32F)
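What NORM_MINMAX with alpha=0.0 and beta=1.0 computes can be sketched in plain NumPy (the array here is a made-up stand-in for your image data):

```python
import numpy as np

data = np.array([[50, 100], [150, 200]], dtype=np.uint8)

# Min-max normalization: stretch the data so its minimum lands
# on 0.0 and its maximum on 1.0, matching matplotlib's default.
f = data.astype(np.float32)
normalized = (f - f.min()) / (f.max() - f.min())  # 50 -> 0.0, 200 -> 1.0
```

Displaying this float image with cv2.imshow should then look the same as the auto-scaled matplotlib output.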

Upvotes: 7
