Reputation: 1
How can I pass an ImageDataGenerator to segmentation_models' U-Net? This is my current setup:
data_generator = ImageDataGenerator(
    rescale=1. / 255.
)
train_dataset_images = data_generator.flow_from_directory(
    directory=image_directory,
    target_size=(256, 256),
    class_mode=None,
    batch_size=32,
    seed=custom_seed
)
train_dataset_masks = data_generator.flow_from_directory(
    directory=mask_directory,
    target_size=(256, 256),
    batch_size=32,
    class_mode=None,
    color_mode='grayscale',
    seed=custom_seed
)
train_generator = zip(train_dataset_images, train_dataset_masks)
When I run this, I get a ValueError saying "expected 1 input but received 2", so I tried combining the two generators with these functions:
def combine_generator(image_generator, mask_generator):
    while True:
        image_batch = image_generator.next()
        mask_batch = mask_generator.next()
        yield (image_batch, mask_batch)
and
def combine_generator(image_gen, mask_gen):
    for img, mask in zip(image_gen, mask_gen):
        yield img, mask
These don't seem to work.
Upvotes: 0
Views: 37
Reputation: 169
Create subfolders for the images and for the masks inside the train directory and pass each one to its own image_datagen.flow_from_directory() call. Mask images are generally grayscale. Then create a generator that yields batches of images and masks, build a U-Net model, and pass that generator to the fit function; the masks do not need to be one-hot encoded. I have built a U-Net myself, please refer to this gist.
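As a minimal sketch of that flow (assuming the segmentation_models package with a resnet34 backbone, placeholder train/images and train/masks directories, and a placeholder seed; the gist itself is not reproduced here):

import os
os.environ['SM_FRAMEWORK'] = 'tf.keras'  # may be needed so segmentation_models picks tf.keras

import segmentation_models as sm
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Hypothetical layout and seed -- adjust to your own data.
# flow_from_directory expects one subfolder level, e.g. train/images/img/ and train/masks/img/
image_directory = 'train/images'
mask_directory = 'train/masks'
custom_seed = 42

data_generator = ImageDataGenerator(rescale=1. / 255.)

image_iter = data_generator.flow_from_directory(
    directory=image_directory,
    target_size=(256, 256),
    class_mode=None,          # no labels, just the image arrays
    batch_size=32,
    seed=custom_seed          # same seed keeps images and masks in the same order
)
mask_iter = data_generator.flow_from_directory(
    directory=mask_directory,
    target_size=(256, 256),
    class_mode=None,
    color_mode='grayscale',   # masks as single-channel arrays, no one-hot encoding
    batch_size=32,
    seed=custom_seed
)

def combine_generator(image_gen, mask_gen):
    # Yield (inputs, targets) tuples so Keras sees one input and one target per step
    while True:
        yield next(image_gen), next(mask_gen)

train_generator = combine_generator(image_iter, mask_iter)

model = sm.Unet('resnet34', input_shape=(256, 256, 3), classes=1, activation='sigmoid')
model.compile(optimizer='adam',
              loss=sm.losses.bce_jaccard_loss,
              metrics=[sm.metrics.iou_score])

model.fit(train_generator,
          steps_per_epoch=len(image_iter),  # the iterator reports its number of batches
          epochs=10)

Because both iterators are built with the same seed, each image batch stays aligned with its corresponding mask batch, and the combined generator yields the (input, target) tuples that fit expects, which avoids the "expected 1 input but received 2" error.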
Upvotes: 0