Reputation: 123
Currently I am feeding all the images at once to predict_generator. I want to feed small sets of images from the validation_generator and make predictions on them, so that there are no memory issues with large datasets. How should I change the following code?
from keras import applications
from keras.preprocessing.image import ImageDataGenerator

top_model_weights_path = '/home/rehan/ethnicity.071217.23-0.28.hdf5'
path = "/home/rehan/countries/pakistan/guys/"
img_width, img_height = 139, 139
confidence = 0.8

# Pretrained base network used as a feature extractor (no classification head)
model = applications.InceptionResNetV2(include_top=False, weights='imagenet',
                                       input_shape=(img_width, img_height, 3))
print("base pretrained model loaded")

validation_generator = ImageDataGenerator(rescale=1./255).flow_from_directory(
    path, target_size=(img_width, img_height), batch_size=32, shuffle=False)
print("validation_generator")

features = model.predict_generator(validation_generator, steps=10)
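The core issue above is that `steps=10` covers only 10 batches (320 images at `batch_size=32`), while the generator may hold far more. A minimal sketch of the batch-wise idea, using plain NumPy stand-ins for the Keras generator and model (the `predict_in_batches` helper and the dummy data are illustrative, not part of the Keras API):

```python
# Sketch of batch-wise prediction: process ceil(N / batch_size) chunks so
# only one batch of images is in memory at a time. NumPy stands in for the
# Keras generator/model here.
import math
import numpy as np

def predict_in_batches(predict_fn, data, batch_size=32):
    """Run predict_fn on fixed-size slices of data and collect the results."""
    steps = math.ceil(len(data) / batch_size)  # cover every image, not a fixed 10
    outputs = []
    for i in range(steps):
        batch = data[i * batch_size:(i + 1) * batch_size]
        outputs.append(predict_fn(batch))
    return np.concatenate(outputs)

# Dummy stand-ins: 100 "images", a "model" that returns one score per image.
images = np.random.rand(100, 139, 139, 3).astype(np.float32)
scores = predict_in_batches(lambda b: b.mean(axis=(1, 2, 3)), images)
print(scores.shape)  # (100,)
```

In real Keras code the same effect comes from setting `steps=len(validation_generator)`, since the generator's length is its number of batches.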
Upvotes: 0
Views: 1050
I ran a loop over the generator object and stored the results in lists, which got rid of the memory issues.
from keras.preprocessing.image import ImageDataGenerator

validation_generator = ImageDataGenerator(rescale=1./255).flow_from_directory(
    path, target_size=(img_width, img_height), batch_size=32, shuffle=False)

prediction_proba1 = []
prediction_classes1 = []
print("validation_generator")
print(len(validation_generator))  # number of batches in the generator

# model (the feature extractor) comes from the question's setup;
# model1 is the trained top classifier loaded separately.
for i in range(len(validation_generator)):
    # Indexing the generator yields one batch: a tuple (images, labels)
    kl = validation_generator[i]
    # Extract bottleneck features for this batch only
    features = model.predict_on_batch(kl[0])
    # Classify the batch's features with the top model
    prediction_proba = model1.predict_proba(features)
    prediction_classes = model1.predict_classes(features)
    prediction_classes1.extend(prediction_classes)
    prediction_proba1.extend(prediction_proba)

print(prediction_classes1)
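Once the loop has filled the lists, they can be stacked into arrays and filtered with the `confidence = 0.8` threshold defined in the question. A minimal sketch with NumPy only (the per-batch arrays below are simulated stand-ins for what `model1` would produce, not real model output):

```python
# Sketch of post-processing the per-batch results: concatenate the lists
# into flat arrays, then keep only predictions that clear the confidence
# threshold. The accumulator contents are simulated here.
import numpy as np

confidence = 0.8
# Simulated accumulators, as if filled by model1 over three batches.
prediction_proba1 = [np.array([0.95, 0.40]), np.array([0.81, 0.10]), np.array([0.79])]
prediction_classes1 = [np.array([1, 0]), np.array([1, 0]), np.array([0])]

proba = np.concatenate(prediction_proba1)
classes = np.concatenate(prediction_classes1)

mask = proba >= confidence  # True where the model is confident enough
confident_classes = classes[mask]
print(confident_classes)  # only the predictions at or above the threshold
```

This keeps the memory footprint flat during prediction while still giving whole-dataset arrays at the end.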
Upvotes: 1