0xPrateek

Reputation: 1188

Custom ImageDataGenerator keras

I've been trying to implement a custom Keras ImageDataGenerator so that I can do hair and microscope image augmentation.

This is the Datagenerator class:

class DataGenerator( Sequence ):

    def __init__(self,image_paths,labels, augmentations, batch_size=32, image_dimension=(224,224,3), shuffle=False):
        self.image_paths = image_paths
        self.labels = labels
        self.batch_size = batch_size
        self.image_dimension = image_dimension
        self.shuffle = shuffle
        self.augment = augmentations

    def __len__(self):
        return int(np.ceil(len(self.image_paths) / self.batch_size ))

    def _getitem__(self,index):
        indexes = self.indexes[index*self.batch_size : (index+1)*self.batch_size]
        batch_y = np.array([self.labels[k] for k in indexes])
        batch_x = [cv2.cvtColor(cv2.imread(self.image_paths[k]), cv2.COLOR_RGB2BGR) for k in indexes]

        return np.stack([
            self.augment(image=x)["image"] for x in batch_x
        ], axis=0), np.array(batch_y)

The code below is for the albumentations augmentation (just trying albumentations augmentations to test whether the data generator works or not):

AUGMENTATIONS_TRAIN = Compose([
    HorizontalFlip(p=0.5),
    RandomContrast(limit=0.2, p=0.5),
    RandomGamma(gamma_limit=(80, 120), p=0.5),
    RandomBrightness(limit=0.2, p=0.5),
    HueSaturationValue(hue_shift_limit=5, sat_shift_limit=20,
                       val_shift_limit=10, p=.9),
    # CLAHE(p=1.0, clip_limit=2.0),
    ShiftScaleRotate(
        shift_limit=0.0625, scale_limit=0.1, 
        rotate_limit=15, border_mode=cv2.BORDER_REFLECT_101, p=0.8), 
    ToFloat(max_value=255)
])

AUGMENTATIONS_TEST = Compose([
    # CLAHE(p=1.0, clip_limit=2.0),
    ToFloat(max_value=255)
])

Now creating the DataGenerator objects:

train_datagen = DataGenerator(  train['images'],
                                train['target'],
                                augmentations=AUGMENTATIONS_TRAIN,
                                batch_size=32,
                                image_dimension=(224,224,3) )
val_datagen = DataGenerator(   validation['images'],
                               validation['target'],
                               augmentations=AUGMENTATIONS_TEST,
                               batch_size=16,
                               image_dimension=(224,224,3) )

A NotImplementedError is raised when I run model.fit_generator(generator=train_datagen, steps_per_epoch=30, epochs=30, validation_data=val_datagen, validation_steps=15)

I have shared my kernel here, and I was taking help from here. I have also looked at other ways to augment, which were all much the same.

I would be thankful if someone could tell me why and where the problem is. Also, is there any other good way to do custom image augmentation in Keras?

Upvotes: 3

Views: 3199

Answers (1)

Akash Kumar

Reputation: 483

You can have a look at the imgaug library. albumentations and imgaug are almost the same. Write the sequence of operations and then just pass it to the ImageDataGenerator preprocessing_function. I tried using the albumentations library but faced some errors.

from imgaug import augmenters as iaa

# apply an augmenter only some of the time
sometimes = lambda aug: iaa.Sometimes(0.5, aug)

seq = iaa.Sequential([
    iaa.Fliplr(0.5), # horizontally flip
    # sometimes(iaa.AdditiveGaussianNoise(loc=0, scale=(0.0, 0.05), per_channel=0.5)),
    iaa.OneOf([
        iaa.Sharpen(alpha=(0, 1.0), lightness=(0.75, 1.5)),
        iaa.Emboss(alpha=(0, 1.0), strength=(0, 2.0)),
        # iaa.Noop(),
        iaa.GaussianBlur(sigma=(0.0, 1.0)),
        # iaa.Noop(),
        iaa.Affine(rotate=(-10, 10), translate_percent={"x": (-0.25, 0.25)}, mode='symmetric', cval=(0)),
        # iaa.Noop(),
        # iaa.PerspectiveTransform(scale=(0.04, 0.08)),
        # # iaa.Noop(),
        # iaa.PiecewiseAffine(scale=(0.05, 0.1), mode='edge', cval=(0)),
        
    ]),
    sometimes(iaa.ElasticTransformation(alpha=(0.5, 3.5), sigma=0.25)),
    # More as you want ...
], random_order=True)

from keras.preprocessing.image import ImageDataGenerator

datagen = ImageDataGenerator(preprocessing_function=seq.augment_image)
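For reference, a minimal sketch of how this datagen could then feed training directly via flow (trainX, trainY, the batch size, and the model are assumptions borrowed from the later snippet, not part of the original setup):

# trainX / trainY are assumed to be numpy arrays of images and labels
train_flow = datagen.flow(trainX, trainY, batch_size=32)

# each batch yielded by train_flow has already passed through seq.augment_image
model.fit_generator(
    train_flow,
    steps_per_epoch=len(trainX) // 32,
    epochs=30,
)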

There are some advanced data augmentation practices such as cutout, random erasing, and mixup. They are easy to implement in Keras. For mixup, an example is below:

training_generator = MixupGenerator(trainX, trainY, batch_size=8, alpha=0.2, datagen=datagen)()
x, y = next(training_generator)

# To visualize the batch images
import matplotlib.pyplot as plt

for i in range(9):
    plt.subplot(330+1+i)
    # batch = it.next()
    img = x[i]
    plt.imshow(img.reshape(224, 224, 3))
plt.savefig("mixup_batch.png")

H = model.fit_generator(
    # datagen.flow(trainX, trainY, batch_size=args.batch_size),
    training_generator,
    steps_per_epoch=len(trainX) // args.batch_size,
    validation_data=(valX, valY),
    validation_steps=len(valX) // args.batch_size,
    epochs=args.epochs,
    # workers=4,
    callbacks=[model_checkpoint, lr_reducer, stopping, lr_schedule],
)

The problem I faced with this, though, is that random erasing also needs to go into the ImageDataGenerator preprocessing_function, and we have already put the imgaug augmentation there. A possible alternative is maybe to use two data generators.
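As a rough illustration of that combination, here is a minimal sketch (not from the original answer) that chains the imgaug sequence and a simple random-erasing step inside a single preprocessing_function; the random_erase helper and its p/max_frac parameters are purely illustrative assumptions:

import numpy as np

def random_erase(image, p=0.5, max_frac=0.3):
    # with probability p, overwrite a random rectangle with random pixel values
    if np.random.rand() > p:
        return image
    h, w = image.shape[:2]
    eh = np.random.randint(1, int(h * max_frac))
    ew = np.random.randint(1, int(w * max_frac))
    y = np.random.randint(0, h - eh)
    x = np.random.randint(0, w - ew)
    erased = image.copy()
    erased[y:y + eh, x:x + ew] = np.random.randint(0, 256, size=(eh, ew, image.shape[2]))
    return erased

def combined_preprocessing(image):
    # run the imgaug pipeline first, then random erasing, all in one function
    image = seq.augment_image(image)
    return random_erase(image)

datagen = ImageDataGenerator(preprocessing_function=combined_preprocessing)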

Upvotes: 2
