Him

Reputation: 5549

Create a temporary keras session?

I have a bunch of models floating around: I clone them, cross-validate them, do hyperparameter selection, and so on. As a result, my keras global session can get quite mucked up. The solution suggested in various threads is to call .clear_session(), but that throws away any models I want to keep. One option is to train each model in a separate process with multiprocessing, but it would be more convenient to just instantiate a new session for each model, as one might do with Tensorflow:

def score_model(**hyperparameters):
    with tf.Graph().as_default():
        my_model = build_model(**hyperparameters)
        with tf.Session() as sess:
            my_model.train(X,y)
            score = my_model.score()
    # now it's all gone, I have the score, so I don't need the model anymore
    # the rest of my_model should get garbage collected, hooray!
    return score

Can I do this sort of thing with keras?

UPDATE

The sess.as_default() method is crashing my kernel. My memory does not seem to be running low, and it gives no error whatsoever. In the following loop I can't even make it to i=2 before crashing.

from sklearn.datasets import load_iris
import numpy as np
import sklearn
import sklearn.preprocessing
import keras
import keras.wrappers.scikit_learn
import tensorflow as tf
import keras.models
import os


def sessioned(f):
    def sessioned_f(self, *args, **kwargs):
        if not hasattr(self, "sess"):
            self.sess = tf.Session()
        with self.sess.as_default():
            return f(self, *args, **kwargs)
    return sessioned_f

class LogisticRegression(keras.wrappers.scikit_learn.KerasClassifier):   
    def __init__(self, n_epochs=100, **kwargs):
        self.n_epochs = n_epochs
        super().__init__(**kwargs)
    @sessioned
    def fit(self, X, y,**kwargs):
        # get the shape of X and one hot y
        self.input_shape = X.shape[-1]
        self.label_encoder = sklearn.preprocessing.LabelEncoder()
        self.label_encoder.fit(y)
        self.output_shape = len(self.label_encoder.classes_)
        label_encoded = self.label_encoder.transform(y).reshape((-1,1))
        y_onehot = sklearn.preprocessing.OneHotEncoder().fit_transform(label_encoded).toarray()
        super().fit(X,y_onehot,epochs=self.n_epochs,verbose=1,**kwargs)
        return self
    @sessioned
    def predict_proba(self, X):
        return super().predict_proba(X)
    def check_params(self, params):
        # skip parameter validation
        pass
    @sessioned
    def __call__(self): # the build_fn thing
        # create model
        model = keras.models.Sequential()
        model.add(keras.layers.Dense(self.output_shape, input_dim=self.input_shape, kernel_initializer="normal", activation="softmax"))
        # Compile model
        model.compile(loss='categorical_crossentropy', optimizer='adam')
        return model

data = load_iris()
i=0
while True:
    print(i)
    graph = tf.Graph()
    with graph.as_default():
        model = LogisticRegression()
        model.fit(data.data, data.target)
        model.sess.close()
        del model
    i+=1
    del graph

Upvotes: 1

Views: 2085

Answers (1)

Jeremy Bare

Reputation: 550

You can use Keras exactly as you described, except that instead of running TensorFlow code inside the with statements, you run the Keras code.

To set the session you would use

with sess.as_default():
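
Putting that together with the score_model from the question, the pattern looks roughly like this (just a sketch: build_model, X and y are the placeholders from the question, and build_model is assumed to return a compiled Keras model):

import tensorflow as tf

def score_model(**hyperparameters):
    graph = tf.Graph()
    with graph.as_default():
        sess = tf.Session()          # session bound to the temporary graph
        with sess.as_default():
            # Keras picks this session up as its default (see get_session() below)
            my_model = build_model(**hyperparameters)
            my_model.fit(X, y)
            score = my_model.evaluate(X, y)
        sess.close()
    # the graph, session and model all go out of scope here
    return score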

Here is a link with more information: https://blog.keras.io/keras-as-a-simplified-interface-to-tensorflow-tutorial.html

I have also found it helpful to look at the source code inside keras.backend. If you look at get_session() you can see that Keras first checks whether there is a TensorFlow default session. Otherwise it uses the session that was set on Keras with set_session(). Finally, if no session has been set, it creates one.
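
Roughly, that lookup order can be sketched like this (a paraphrase of the behaviour described above, not the actual keras.backend source; _keras_session stands in for the module-level global that set_session() writes to):

import tensorflow as tf

_keras_session = None  # stand-in for the global filled by keras.backend.set_session()

def get_session_sketch():
    # 1. a session made current with sess.as_default() wins
    if tf.get_default_session() is not None:
        return tf.get_default_session()
    # 2. otherwise use the session registered via set_session()
    if _keras_session is not None:
        return _keras_session
    # 3. otherwise Keras creates (and caches) a session of its own
    return tf.Session()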

Upvotes: 1
