Reputation: 71
I am working with 388 3D MRI images that are too big to fit into the available memory when training a CNN model, so I chose to create a generator that loads batches of images into memory at a time and combine it with a custom ImageDataGenerator for 3D images (downloaded from GitHub). I am trying to predict a single test score (in the range 1-30) from an MRI image. I have the following generator code, which I am not sure is correct:
import numpy as np
from sklearn.utils import shuffle
from sklearn.model_selection import train_test_split

x = np.asarray(img)
y = np.asarray(scores)

def create_batch(x, y, batch_size):
    # Shuffle, then split off a validation set and a test set (5% each).
    x, y = shuffle(x, y)
    x_split, x_val, y_split, y_val = train_test_split(x, y, test_size=.05, shuffle=True)
    x_batch, x_test, y_batch, y_test = train_test_split(x_split, y_split, test_size=.05, shuffle=True)
    # Group the remaining images into batch-sized chunks for training.
    x_train, y_train = [], []
    num_batches = len(x_batch)//batch_size
    for i in range(num_batches):
        x_train.append([x_batch[0:batch_size]])
        y_train.append([y_batch[0:batch_size]])
    return x_train, y_train, x_val, y_val, x_batch, y_batch, x_test, y_test, num_batches
epochs = 1
model = build_model(input_size)
x_train, y_train, x_val, y_val, x_batch, y_batch, x_test, y_test, num_batches = create_batch(x, y, batch_size)
train_datagen = customImageDataGenerator(shear_range=0.2,
                                         zoom_range=0.2,
                                         horizontal_flip=True)
val_datagen = customImageDataGenerator()
validation_set = val_datagen.flow(x_val, y_val, batch_size=batch_size, shuffle=False)
def generator(batch_size, epochs):
    for e in range(epochs):
        print('Epoch', e+1)
        batches = 0
        images_fitted = 0
        for i in range(num_batches):
            # Wrap the current chunk of images in the augmenting generator.
            training_set = train_datagen.flow(x_train[i][0], y_train[i][0], batch_size=batch_size, shuffle=False)
            images_fitted += len(x_train[i][0])
            total_images = len(x_batch)
            print('number of images used: %s/%s' % (images_fitted, total_images))
            history = model.fit_generator(training_set,
                                          steps_per_epoch=1,
                                          #callbacks=[earlystop],
                                          validation_data=validation_set,
                                          validation_steps=1)
            # Reload the checkpoint saved by train_load_weights() after the previous batch.
            model.load_weights('jesse_weights_13layers.h5')
            batches += 1
            yield history
            if batches >= num_batches:
                break
    return model
def train_load_weights():
    history = generator(batch_size, epochs)
    for e in range(epochs):
        for i in range(num_batches):
            # Advance the generator by one batch, print the History, and checkpoint the weights.
            print(next(history))
            model.save_weights('jesse_weights_13layers.h5')

for i in range(1):
    print('Run', i+1)
    train_load_weights()
I am not sure if the generator was built correctly or if the model is being trained correctly, and I don't know how to verify it beyond the rough batch check sketched after the log below. If anyone has any advice, I would appreciate it! The code runs, and here is a portion of the training output:
Run 1
Epoch 1
number of images used: 8/349
Epoch 1/1
1/1 [==============================] - 156s 156s/step - loss: 8.0850 - accuracy: 0.0000e+00 - val_loss: 10.8686 - val_accuracy: 0.0000e+00
<keras.callbacks.callbacks.History object at 0x00000269A4B4E848>
number of images used: 16/349
Epoch 1/1
1/1 [==============================] - 154s 154s/step - loss: 4.3460 - accuracy: 0.0000e+00 - val_loss: 4.5994 - val_accuracy: 0.0000e+00
<keras.callbacks.callbacks.History object at 0x0000026899A96708>
number of images used: 24/349
Epoch 1/1
1/1 [==============================] - 148s 148s/step - loss: 4.1174 - accuracy: 0.0000e+00 - val_loss: 4.6038 - val_accuracy: 0.0000e+00
<keras.callbacks.callbacks.History object at 0x00000269A4F2F488>
number of images used: 32/349
Epoch 1/1
1/1 [==============================] - 151s 151s/step - loss: 4.2788 - accuracy: 0.0000e+00 - val_loss: 4.6029 - val_accuracy: 0.0000e+00
<keras.callbacks.callbacks.History object at 0x00000269A4F34D08>
number of images used: 40/349
Epoch 1/1
1/1 [==============================] - 152s 152s/step - loss: 3.9328 - accuracy: 0.0000e+00 - val_loss: 4.6057 - val_accuracy: 0.0000e+00
<keras.callbacks.callbacks.History object at 0x00000269A4F57848>
number of images used: 48/349
Epoch 1/1
1/1 [==============================] - 154s 154s/step - loss: 3.9423 - accuracy: 0.0000e+00 - val_loss: 4.6077 - val_accuracy: 0.0000e+00
<keras.callbacks.callbacks.History object at 0x00000269A4F4D888>
number of images used: 56/349
Epoch 1/1
1/1 [==============================] - 160s 160s/step - loss: 3.7610 - accuracy: 0.0000e+00 - val_loss: 4.6078 - val_accuracy: 0.0000e+00
<keras.callbacks.callbacks.History object at 0x00000269A4F3E4C8>
number of images used: 64/349
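The only check I could come up with is to pull a single batch out of the augmenting generator and look at its shape and label range before fitting; a minimal sketch along those lines (assuming the custom flow() yields (images, labels) tuples the way the stock Keras iterator does, which I have not confirmed):
# Rough sanity check (hypothetical): inspect one augmented batch before training.
check_set = train_datagen.flow(x_train[0][0], y_train[0][0], batch_size=batch_size, shuffle=False)
x_check, y_check = next(check_set)  # assumes the custom iterator yields (images, labels)
print('image batch:', x_check.shape, x_check.dtype)   # expect (batch_size, depth, height, width, channels)
print('label batch:', y_check.shape, y_check.min(), y_check.max())  # scores should stay within 1-30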
Upvotes: 3
Views: 1952
Reputation: 2761
Not sure how your directory is structured, but if it is like this:
|---train
|------class1
|---------1.jpg
|---------2.jpg
|------class2
|---------3.jpg
|..........
|---test
|----label
|---------t1.jpg
|---------t2.jpg
NOTE: there is a subfolder after "test"
Then this is how to use ImageDataGenerator:
generator = ImageDataGenerator(..., validation_split=...) # for train and valid; augment data here too
train_gen = generator.flow_from_directory("<path_to_train>/train", batch_size=..., target_size=..., subset="training")
valid_gen = generator.flow_from_directory("<path_to_train>/train", batch_size=..., target_size=..., subset="validation")
test_generator = ImageDataGenerator(...) # no validation split, no augmentation
test_gen = test_generator.flow_from_directory("<path_to_test>/test", class_mode=None, ...)
Then just call:
model.fit(train_gen, validation_data=valid_gen,...)
model.predict(test_gen)
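Putting that together, a minimal end-to-end sketch might look like the following (the paths, image size, batch size, and the tiny placeholder model are all hypothetical; substitute your own model and directories):
from tensorflow.keras import layers, models
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# One generator for train/validation (augmentation + split), a plain one for test.
generator = ImageDataGenerator(rescale=1./255, shear_range=0.2, zoom_range=0.2,
                               horizontal_flip=True, validation_split=0.1)

train_gen = generator.flow_from_directory("data/train", target_size=(128, 128),
                                          batch_size=8, class_mode="sparse",
                                          subset="training")
valid_gen = generator.flow_from_directory("data/train", target_size=(128, 128),
                                          batch_size=8, class_mode="sparse",
                                          subset="validation")

test_generator = ImageDataGenerator(rescale=1./255)            # no split, no augmentation
test_gen = test_generator.flow_from_directory("data/test", target_size=(128, 128),
                                              batch_size=8, class_mode=None,
                                              shuffle=False)    # keep file order for predictions

# Placeholder model just to make the example run end to end.
model = models.Sequential([
    layers.Input(shape=(128, 128, 3)),
    layers.Conv2D(16, 3, activation="relu"),
    layers.GlobalAveragePooling2D(),
    layers.Dense(train_gen.num_classes, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])

model.fit(train_gen, validation_data=valid_gen, epochs=10)
predictions = model.predict(test_gen)   # one row of class probabilities per test image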
Upvotes: 1