Sebastián Mayorga

Reputation: 151

Convert an image format from 32FC1 to 16UC1

I need to encode an image in 16UC1 format, but I receive the error: cv_bridge.core.CvBridgeError:encoding specified as 16UC1, but image has incompatible type 32FC1

I tried to use the skimage function img_as_uint, but since my image values are not between -1 and 1 it doesn't work. I also tried to "normalize" my values by dividing them all by the maximum obtained from np.amax, but applying the skimage function afterwards only returns a blank image.

Is there a way of achieving this conversion?

This is the original 32FC1 image

Upvotes: 0

Views: 2480

Answers (1)

shortcipher3

Reputation: 1380

With numpy you should be able to:

import numpy as np
img = np.random.normal(0, 1, (300, 300, 3)).astype(np.float32) # simulated image
uimg = img.astype(np.uint16)

You will probably first want to do some kind of normalization if the values aren't already in an unsigned range. Something like:

img_normalized = (img - img.min()) / (img.max() - img.min()) * 65535

But your normalization strategy will depend on what you want to accomplish.
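For instance, a minimal min-max scaling helper (the function name is illustrative) might look like the following; it also guards against a constant image, where max == min would divide by zero:

```python
import numpy as np

def to_uint16(img):
    """Scale a float image to the full uint16 range [0, 65535]."""
    img = np.asarray(img, dtype=np.float64)
    rng = img.max() - img.min()
    if rng == 0:
        # Constant image: avoid division by zero, return all zeros.
        return np.zeros(img.shape, dtype=np.uint16)
    scaled = (img - img.min()) / rng * 65535
    return scaled.astype(np.uint16)
```

Note the scale factor is 65535, not 256**2: a value of exactly 65536 would wrap around to 0 when cast to uint16.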


Thanks for sharing an image. I can visualize as follows:

import numpy as np
import matplotlib.pyplot as plt

arr = np.load('32FC1_image.npz')
img = arr['arr_0']
img = np.squeeze(img) # drop the singleton dimensions that keep matplotlib from recognizing it as an image; they may also be causing your cv_bridge problem
img_normalized = (img - img.min()) / (img.max() - img.min()) * 65535
img_normalized = img_normalized.astype(np.uint16)
plt.imshow(img_normalized)

Try using the normalized 16-bit image.
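Coming back to the original cv_bridge error: once the array is squeezed to two dimensions and cast to uint16, it has the single-channel uint16 layout that a 16UC1 encoding expects. A minimal sketch (the simulated image shape is illustrative; the cv_bridge call is commented out since it needs a ROS environment):

```python
import numpy as np

# Simulated 32FC1 image with a stray singleton dimension,
# similar to the one in the question.
img = np.random.uniform(0.0, 10.0, (1, 480, 640)).astype(np.float32)

img = np.squeeze(img)                  # drop singleton dimensions
lo, hi = img.min(), img.max()
img16 = ((img - lo) / (hi - lo) * 65535).astype(np.uint16)

# With cv_bridge, this array should now be accepted as 16UC1:
# bridge = CvBridge()
# msg = bridge.cv2_to_imgmsg(img16, encoding='16UC1')
assert img16.dtype == np.uint16 and img16.ndim == 2
```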

Upvotes: 1
