Reputation: 1
import cv2
from tensorflow.keras.preprocessing.image import ImageDataGenerator

def AHE(img):
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8,8))
    eq = clahe.apply(gray)
    return eq
IMG_SIZE = (120,120)
batch_size = 8
epoch = 10
train_image_generator = ImageDataGenerator(rescale=1./119, rotation_range=30,
                                           horizontal_flip=True,
                                           preprocessing_function=AHE)
validation_image_generator = ImageDataGenerator(rescale=1./119)
test_image_generator = ImageDataGenerator(rescale=1./119, preprocessing_function=AHE)
train_data_gen = train_image_generator.flow_from_directory(batch_size=batch_size,
                                                            directory=train_dir,
                                                            shuffle=True,
                                                            target_size=IMG_SIZE)
val_data_gen = validation_image_generator.flow_from_directory(batch_size=batch_size,
                                                               directory=validate_dir,
                                                               shuffle=True,
                                                               target_size=IMG_SIZE)
test_data_gen = test_image_generator.flow_from_directory(batch_size=batch_size,
                                                          directory=test_dir,
                                                          shuffle=True,
                                                          target_size=IMG_SIZE)
sample_test_images, labels = next(test_data_gen)
print(labels[0:10])
print(sample_test_images.shape)
print(labels.shape)
Even though I converted the image to grayscale, I'm still getting this error:
OpenCV(4.1.2) /io/opencv/modules/imgproc/src/clahe.cpp:351: error: (-215:Assertion failed) _src.type() == CV_8UC1 || _src.type() == CV_16UC1 in function 'apply'
Upvotes: 0
Views: 1080
Reputation: 56
I ran into exactly the same problem. What is asserted is the significant part here:
(-215:Assertion failed) _src.type() == CV_8UC1 || _src.type() == CV_16UC1
This says that the apply function expects its input to be one of two specific types, CV_8UC1 or CV_16UC1. These correspond to np.uint8 and np.uint16, respectively, so casting the input array to one of those types resolves the error.
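For example (a minimal, standalone illustration using a random array as a stand-in for what the generator actually passes in), the image arrives at preprocessing_function as float32, which is exactly what trips the assertion, and a cast fixes it:
import cv2
import numpy as np

clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))

img = (np.random.rand(120, 120, 3) * 255).astype(np.float32)  # stand-in for one image from the generator
gray = cv2.cvtColor(img, cv2.COLOR_RGB2GRAY)                  # result is still float32 (CV_32FC1)
# clahe.apply(gray)                        # -> raises the (-215) assertion
eq = clahe.apply(gray.astype(np.uint8))    # cast to CV_8UC1 (or np.uint16 for CV_16UC1) -> works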
Another issue then came up: the array returned by the function no longer had the same shape as its input. According to the Keras documentation, "The function should take one argument: one image (Numpy tensor with rank 3), and should output a Numpy tensor with the same shape." To resolve this, I converted the equalized image back to a 3-channel color image (BGR is native to OpenCV, but since Keras delivers RGB I convert to and from RGB) and cast the array back to np.float32.
So your function may be along these lines:
import cv2
import numpy as np

clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8,8))

def AHE(img):
    # Keras passes a float32 RGB image; CLAHE needs a single-channel 8- or 16-bit array
    gray = cv2.cvtColor(img, cv2.COLOR_RGB2GRAY)
    gray = gray.astype(np.uint16)              # np.uint8 also satisfies the assertion
    eq = clahe.apply(gray)
    # restore the rank-3 shape and float dtype that ImageDataGenerator expects
    eq = cv2.cvtColor(eq, cv2.COLOR_GRAY2RGB)
    eq = eq.astype(np.float32)
    return eq
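As a quick sanity check (again using a random array in place of a real image from the generator), the function now returns a tensor with the same shape and a float dtype, which is the contract preprocessing_function has to meet:
import numpy as np

x = (np.random.rand(120, 120, 3) * 255).astype(np.float32)  # stand-in for one generator image
y = AHE(x)
print(x.shape, y.shape)   # (120, 120, 3) (120, 120, 3)
print(y.dtype)            # float32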
BTW, with this added preprocessing, each epoch takes about three times as long (GPU on a Colab notebook). I'm working on classifying chest X-rays at the moment, so I hope the enhancement is worth the overhead. ;)
Upvotes: 1