InterestingPenguin80

Reputation: 91

Why is cv2.resize() distorting my images?

I have the following image:

Original Image

I am using the following code to resize this image to 1600x1200.

img = cv2.imread('R.png')
gray_image = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
gray_image.resize(1600,1200)

I then get the following image:

Final Image

I have tried using different image formats (jpg, tif), but that does not help. I also tried different interpolation algorithms such as INTER_NEAREST and INTER_LINEAR, and they produce the same result.

Does anyone have an idea?

Upvotes: 0

Views: 343

Answers (1)

kaiffeetasse

Reputation: 580

You are calling numpy's resize() method on the array that holds the grayscale image. That only changes the shape of the array in place; it does not interpolate the pixel values, which is why the image comes out scrambled. Use OpenCV's resize() function instead:

img = cv2.imread('R.png')
resized_image = cv2.resize(img, (1600, 1200), interpolation=cv2.INTER_LINEAR)
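
For reference, here is a minimal sketch (assuming the R.png from the question is available) that contrasts the two calls: numpy's in-place resize() only reshapes the pixel buffer, while cv2.resize() actually interpolates to the new size.

import cv2

img = cv2.imread('R.png')
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)   # numpy shape is (height, width)

scrambled = gray.copy()
scrambled.resize(1600, 1200)                   # reinterprets the buffer, no interpolation
resized = cv2.resize(gray, (1600, 1200))       # interpolates; note the (width, height) order

print(scrambled.shape)                         # (1600, 1200) - rows wrap, image looks garbled
print(resized.shape)                           # (1200, 1600) - properly scaled image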

Besides that, I think you have mistakenly swapped the width and height of the image; it should be 1200 x 1200 to keep the scale.
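
If the goal is simply to scale the image while preserving its aspect ratio, you can compute the target height from the original dimensions. A small sketch under that assumption (the target width of 1200 is only an example value):

import cv2

img = cv2.imread('R.png')
h, w = img.shape[:2]                            # original (height, width)

target_width = 1200                             # example value, adjust as needed
target_height = round(h * target_width / w)     # preserve the aspect ratio

resized = cv2.resize(img, (target_width, target_height),
                     interpolation=cv2.INTER_LINEAR)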

Upvotes: 1
