Reputation: 623
I'm having trouble creating a tf.data.Dataset
using image_dataset_from_directory
for a one-to-one task. That means I will give the model an input image and the output will be another image.
My dataset directory is like this:
Dataset/
...input/
......a_image_1.jpg
......a_image_2.jpg
...output/
......a_image_1.jpg
......a_image_2.jpg
In the dataset directory, corresponding input and target images have the same name. I'm trying to load the dataset in the following way:
dataset_url = "Project/Dataset"
input_size= 300
batch_size = 8
train_ds = image_dataset_from_directory(
dataset_url,
labels='inferred',
batch_size=batch_size,
image_size=(input_size, input_size),
validation_split=0.2,
subset="training",
seed=1337,
label_mode='int',
)
valid_ds = image_dataset_from_directory(
dataset_url,
labels='inferred',
batch_size=batch_size,
image_size=(input_size, input_size),
validation_split=0.2,
subset="validation",
seed=1337,
label_mode='int',
)
This process loads all the images in the two folders as classes 1 and 2. Now how can I map the two classes as input and target instead? Am I on the right track? Is there another way?
Upvotes: 2
Views: 2053
Reputation: 8112
Create a directory data. Within data, create two subdirectories, image and target. Place your input images in the image directory and your target images in the target directory. Make sure your input images and target images have EXACTLY the same filenames. This is needed so that when a batch of images is fetched, its corresponding target images are fetched in the same order. I have used ImageDataGenerator.flow_from_directory as follows:
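As a quick sanity check before training, you can verify that the two folders really do contain exactly the same filenames. This is a minimal sketch (the function name and directory paths are my own, not part of the original answer):

```python
import os

def check_paired_filenames(image_dir, target_dir):
    # Compare the filename sets of the two directories.
    image_names = sorted(os.listdir(image_dir))
    target_names = sorted(os.listdir(target_dir))
    missing_targets = set(image_names) - set(target_names)
    missing_images = set(target_names) - set(image_names)
    if missing_targets or missing_images:
        raise ValueError(f"unpaired files: {missing_targets | missing_images}")
    return image_names  # every image has a matching target
```

Run it once on your data directories; if it raises, fix the filenames before building the generators.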
from tensorflow.keras.preprocessing.image import ImageDataGenerator

image_dir=r'c:\data\image'
target_dir=r'c:\data\target'
target_size=(224,224) # set this to the target size you want
color_mode='rgb' # 3 channels, for color images
shuffle=True
seed=123 # the shared seed keeps image and target batches in the same order
class_mode=None # yield images only, with no labels
batch_size=10 # set this to the desired batch size
vsplit=.2 # set this to the validation split you want
gen=ImageDataGenerator(rescale=1/255, validation_split=vsplit)
image_gen=gen.flow_from_directory(image_dir, target_size=target_size, color_mode=color_mode,
                                  class_mode=class_mode, shuffle=shuffle, seed=seed,
                                  batch_size=batch_size, subset='training')
valid_image_gen=gen.flow_from_directory(image_dir, target_size=target_size, color_mode=color_mode,
                                        class_mode=class_mode, shuffle=shuffle, seed=seed,
                                        batch_size=batch_size, subset='validation')
target_gen=gen.flow_from_directory(target_dir, target_size=target_size, color_mode=color_mode,
                                   class_mode=class_mode, shuffle=shuffle, seed=seed,
                                   batch_size=batch_size, subset='training')
valid_target_gen=gen.flow_from_directory(target_dir, target_size=target_size, color_mode=color_mode,
                                         class_mode=class_mode, shuffle=shuffle, seed=seed,
                                         batch_size=batch_size, subset='validation')
composite_gen=zip(image_gen, target_gen)
valid_gen=zip(valid_image_gen, valid_target_gen)
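If you prefer to stay with tf.data (as in your question), the same pairing idea works there: load each folder as an unlabeled dataset (labels=None in image_dataset_from_directory, with shuffle=False or a shared seed so file order matches) and combine them with tf.data.Dataset.zip. Here is a minimal sketch of the pairing on dummy tensors standing in for the two decoded image datasets:

```python
import numpy as np
import tensorflow as tf

# Dummy stand-ins for two unlabeled datasets loaded in the same file order.
inputs = tf.data.Dataset.from_tensor_slices(np.arange(4, dtype=np.float32))
targets = tf.data.Dataset.from_tensor_slices(np.arange(4, dtype=np.float32) * 10)

# Dataset.zip pairs element i of the first dataset with element i of the
# second, yielding (input, target) tuples that model.fit can consume directly.
paired = tf.data.Dataset.zip((inputs, targets)).batch(2)
```

The same two-line zip works on the datasets returned by image_dataset_from_directory, as long as both were loaded in the same order.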
The composite_gen will yield tuples of (image batch, target image batch). To test it:
import matplotlib.pyplot as plt

images, targets=next(composite_gen)
print(images.shape, targets.shape)
img1=images[0]
target1=targets[0]
# show these two images to ensure the image and target are matched as required
plt.subplot(1,2,1)
plt.imshow(img1)
plt.subplot(1,2,2)
plt.imshow(target1)
plt.show()
You can run the same test on valid_gen. Then use composite_gen and valid_gen as the inputs to model.fit.
Upvotes: 2