Euler_Salter

Reputation: 3561

How to obtain the Tensorflow code version of a NN built in Keras?

I have been working with Keras for a week or so. I know that Keras can use either TensorFlow or Theano as a backend. In my case, I am using TensorFlow.

So I'm wondering: is there a way to write a NN in Keras, and then print out the equivalent version in TensorFlow?

MVE

For instance suppose I write

    # imports (assuming the standalone keras package)
    from keras.models import Sequential
    from keras.layers import Dense

    # create sequential model
    model = Sequential()
    # add layers
    model.add(Dense(100, input_dim=10, activation='relu'))
    model.add(Dense(1, activation='linear'))
    # compile model
    model.compile(optimizer='adam', loss='mse')
    # fit
    model.fit(Xtrain, ytrain, epochs=100, batch_size=32)
    # predict
    ypred = model.predict(Xtest, batch_size=32)
    # evaluate
    result = model.evaluate(Xtest, ytest)

This code might be wrong, since I just started, but I think you get the idea.

What I want to do is write this code, run it (or perhaps not even run it), and then call a function or tool that produces the TensorFlow code that Keras has generated to perform all these calculations.

Upvotes: 2

Views: 386

Answers (2)

charlesreid1

Reputation: 4821

First, let's clarify some of the language in the question. TensorFlow (and Theano) perform tensor computations by building computational graphs. So when you ask whether there is a way to "print out the equivalent version" in TensorFlow, or to "produce TensorFlow code," what you're really asking is: how do you export a TensorFlow graph from a Keras model?

As the Keras author states in this thread,

When you are using the TensorFlow backend, your Keras code is actually building a TF graph. You can just grab this graph.

Keras only uses one graph and one session.

However, the tutorial he links to is now outdated in its details, though the basic concept has not changed.

We just need to:

  • Get the TensorFlow session
  • Export the computation graph from the TensorFlow session

Do it with Keras

The keras_to_tensorflow repository contains a short example, in an IPython notebook, of how to export a Keras model for use in TensorFlow. The approach is essentially the TensorFlow one; it isn't a clearly written example, but it may be useful as a resource.

Do it with TensorFlow

It turns out we can get the TensorFlow session that Keras is using from TensorFlow itself, via the tf.contrib.keras.backend.get_session() function. It's simple to do: just import TensorFlow and call the function, and it returns the TensorFlow session.

Once you have the TensorFlow session variable, you can use the SavedModelBuilder to save your computational graph (the TensorFlow docs include a guide and example for using SavedModelBuilder). If you're wondering how SavedModelBuilder works and what it actually gives you, the SavedModelBuilder README in the GitHub repo is a good guide.
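Putting the two steps together, a minimal sketch might look like the following. This assumes a TensorFlow 1.x install (tf.contrib and the session API were removed in TensorFlow 2.x), that the Keras model has already been built in the same process, and an arbitrary export directory:

```python
import tensorflow as tf

# Step 1: get the session that Keras is using behind the scenes
# (TF 1.x only; tf.contrib was removed in TensorFlow 2.x).
sess = tf.contrib.keras.backend.get_session()

# Step 2: export the graph and variables from that session.
# "/tmp/keras_model" is an arbitrary export directory.
builder = tf.saved_model.builder.SavedModelBuilder("/tmp/keras_model")
builder.add_meta_graph_and_variables(sess, [tf.saved_model.tag_constants.SERVING])
builder.save()
```

Note that this must run after the Keras model has been defined (and usually trained), since the session only contains the graph and variables that Keras has created so far.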

P.S. If you are planning on heavy use of TensorFlow and Keras in combination, have a look at the other modules available in tf.contrib.keras

Upvotes: 2

Umberto

Reputation: 1421

So you want to use a different function than WX + b for your neurons. Well, in TensorFlow you calculate this product explicitly, so for example you write

    y_ = tf.matmul(X, W)

You simply have to write your own formula and let the network learn. It should not be difficult to implement.
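As a concrete sketch of that idea: the standard affine step, and a hand-written alternative formula next to it. The shapes, values, and the squared-input variant here are purely illustrative assumptions, not anything from the question:

```python
import tensorflow as tf

# A small batch of 2 samples with 10 features each (illustrative values).
X = tf.ones([2, 10])
W = tf.ones([10, 1])
b = tf.zeros([1])

# The standard affine step that a Dense layer computes:
y_linear = tf.matmul(X, W) + b

# Your own formula instead, e.g. a squared-input variant (illustrative only):
y_custom = tf.matmul(tf.square(X), W) + b
```

Both expressions produce a [2, 1] output tensor; the network then learns W and b for whatever formula you wrote.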

In addition, what you are trying to do (according to the paper you link) is called batch normalization, and it is relatively standard. The idea is that you normalize your intermediate steps (in the different layers). Check for example https://arxiv.org/abs/1502.03167 or https://bcourses.berkeley.edu/files/66022277/download?download_frd=1&verifier=oaU8pqXDDwZ1zidoDBTgLzR8CPSkWe6MCBKUYan7
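For what it's worth, batch normalization is already built into Keras as a layer, so you don't need to hand-roll it in TensorFlow. A minimal sketch (the layer sizes mirror the model in the question; this is an illustration, not the asker's code):

```python
from keras.models import Sequential
from keras.layers import Dense, BatchNormalization, Activation

model = Sequential()
model.add(Dense(100, input_shape=(10,)))
model.add(BatchNormalization())  # normalize the intermediate activations
model.add(Activation('relu'))
model.add(Dense(1, activation='linear'))
model.compile(optimizer='adam', loss='mse')
```

Inserting BatchNormalization between the affine step and the activation is the placement used in the original paper.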

Hope that helps, Umberto

Upvotes: 1
