Reputation: 123
I have a large dataset of images on which I want to run predict_generator. I cannot run it on all of them at once due to memory issues. The idea is to:
feed small sets of images to the generator iteratively by looping over ranges of images and making predictions for each
save the predictions to a file
later open the file in a loop and read all the predictions to calculate the probabilities, as mentioned in the code.
validation_generator = ImageDataGenerator(rescale=1./255).flow_from_directory(
    path,
    target_size=(img_width, img_height),
    batch_size=6,
    shuffle=False)
print("generator built")
print (counter)
#file = open('Failed.py', 'w')
#for x in file:
# for i in range(counter):
# features = model.predict_generator(validation_generator,steps=2)
print("features found")
model = Sequential()
model.add(Flatten(input_shape=(3, 3, 1536)))
model.add(Dense(256, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(6, activation='softmax'))
model.load_weights(top_model_weights_path)
print("top model loaded")
prediction_proba = model.predict_proba(features)
prediction_classes = model.predict_classes(features)
print(prediction_proba)
print(prediction_classes)
print("original file names")
print(validation_generator.filenames)
The question is: how should the different predictions be saved in one single file? I have tried creating a for loop for the file but am not sure how it should work. It would be nice if someone could give hints for the goals defined above.
Upvotes: 0
Views: 1310
Reputation: 86600
Predicting and saving
i = 0
maximumPredictions = ??
for x, y in generator:  # if the generator doesn't yield y, use only "for x in ..."
    predictions = model.predict(x)
    numpy.save('predictions/prediction' + str(i) + '.npy', predictions)
    i += 1
    if i == maximumPredictions:
        break
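Since the question asks for one single file, a minimal self-contained sketch of the same idea follows, with the per-batch files combined at the end via numpy.concatenate. The model and generator here are stand-ins (the real ones come from Keras); the zero-padded filenames are an assumption I added so the files sort back into the original order.

```python
import os
import tempfile
import numpy as np

# Stand-ins for the real Keras model and generator (assumptions for this
# sketch): "predict" returns a (batch_size, 6) array of class probabilities.
def predict(x):
    # dummy model: uniform probabilities over 6 classes
    return np.full((x.shape[0], 6), 1.0 / 6)

def batch_generator(n_batches, batch_size=6, n_features=4):
    for _ in range(n_batches):
        yield np.random.rand(batch_size, n_features)

out_dir = tempfile.mkdtemp()
maximum_predictions = 3

# Save one file per batch; zero-pad the index so lexicographic sorting
# of filenames matches the numeric batch order.
for i, x in enumerate(batch_generator(maximum_predictions)):
    np.save(os.path.join(out_dir, 'prediction%05d.npy' % i), predict(x))

# Combine all per-batch files into one single file, as asked.
files = sorted(f for f in os.listdir(out_dir) if f.startswith('prediction'))
all_predictions = np.concatenate(
    [np.load(os.path.join(out_dir, f)) for f in files], axis=0)
np.save(os.path.join(out_dir, 'all_predictions.npy'), all_predictions)
print(all_predictions.shape)  # (18, 6): 3 batches of 6 samples each
```

The rows of the combined array line up with validation_generator.filenames as long as shuffle=False, which the question's generator already sets.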
Loading and processing
files = sorted(name for name in os.listdir('predictions'))
for file in files:
    prediction = numpy.load('predictions/' + file)
    # do what you want with the loaded predictions
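As a sketch of the "do what you want" step: once the probability arrays are loaded, taking argmax over the class axis gives the predicted class per sample, which for a softmax output matches what predict_classes returns. The arrays below are hypothetical stand-ins for what numpy.load would return.

```python
import numpy as np

# Hypothetical per-batch probability arrays, as numpy.load would return them.
batch1 = np.array([[0.1, 0.7, 0.2],
                   [0.5, 0.3, 0.2]])
batch2 = np.array([[0.2, 0.2, 0.6]])

# Stack the batches back together in order.
probabilities = np.concatenate([batch1, batch2], axis=0)

# argmax over the class axis gives the predicted class per sample,
# equivalent to predict_classes for a softmax model.
classes = probabilities.argmax(axis=1)
print(classes.tolist())  # [1, 0, 2]
```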
Upvotes: 2