Jivan

Reputation: 23038

Loading models in Keras takes exponentially longer

I have a series of Keras models saved in HDF5 format (including both structure and weights). They are based on the pre-trained DenseNet121 from keras.applications and have been further fine-tuned on custom datasets.

For production use, I need to have all these models loaded in memory at the same time.

from keras.models import load_model

model_names = ['birds', 'cats', 'dogs', 'phones']
models = dict()

# Load every fine-tuned model and keep it in memory
for name in model_names:
    path = 'models/{}.h5'.format(name)
    models[name] = load_model(path)

Loading time seems to grow exponentially with the number of models already loaded. Indicative values are:

All models share the same structure, and each .h5 file takes 82 MB on disk. I'm running this on an AWS p2.xlarge instance with a single GPU.
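A minimal way to measure the per-model load time (the timing code is just a sketch around the loop above):

import time
from keras.models import load_model

model_names = ['birds', 'cats', 'dogs', 'phones']
models = dict()

for name in model_names:
    start = time.perf_counter()  # time each load individually
    models[name] = load_model('models/{}.h5'.format(name))
    print('{}: loaded in {:.1f}s'.format(name, time.perf_counter() - start))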

Questions:

Upvotes: 5

Views: 405

Answers (1)

Daniel Möller

Reputation: 86600

This is not a proven answer; I'm detailing it here from the comment above for you to test.

Join the 4 models into a single one.

How to do that?

Load them and wait all that time (this is still not production).

Now:

from keras.layers import Input
from keras.models import Model

common_input = Input(compatible_shape)  # assuming all models take the same input shape

# Route the shared input through every loaded model
outputs = []
for name in models:
    outputs.append(models[name](common_input))

common_model = Model(common_input, outputs)

Save this common_model and see how much time it takes to load in a new session.
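For instance (the file path is just an example):

common_model.save('models/common.h5')

# In a fresh session:
from keras.models import load_model
common_model = load_model('models/common.h5')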

You can retrieve each model from it with common_model.layers[i]. Check the summary to see which i corresponds to which layer. If you defined your submodels with names, it's easier: common_model.get_layer(model_name).
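A sketch of both approaches (assuming the submodels were given names when they were created; the exact index depends on your model):

common_model.summary()  # inspect which index holds which submodel

# By index (index 1 is just an example):
birds_model = common_model.layers[1]

# By name, if the submodel was built with name='birds':
birds_model = common_model.get_layer('birds')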

Upvotes: 1
