ronroo

Reputation: 595

How can I print the values of Keras tensors?

I am implementing my own Keras loss function. How can I access the tensor values?

What I've tried

def loss_fn(y_true, y_pred):
    print y_true

It prints

Tensor("target:0", shape=(?, ?), dtype=float32)

Is there any Keras function to access y_true values?

Upvotes: 48

Views: 45446

Answers (9)

calmcc

Reputation: 241

If you're using the Keras API in TF2, you can do this with a Keras Lambda layer. This prevents tf.print from being called during model building; instead, it runs during actual training/inference:

activations_1 = ...
activations_2 = tf.keras.layers.Dense(units=123)(activations_1)

def print_act_2(act_2):
    tf.print("act_2", act_2)
    return act_2

activations_2 = tf.keras.layers.Lambda(print_act_2)(activations_2)

Upvotes: 0

F1refly

Reputation: 376

If you are using TensorFlow's keras, you can enable Eager Execution:

import tensorflow as tf 
tf.enable_eager_execution()

Afterwards you can print the tensors in your loss function.
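For example, a minimal sketch (the mean-squared-error body and the direct call are illustrative assumptions, not part of the original answer):

import tensorflow as tf
tf.enable_eager_execution()

def loss_fn(y_true, y_pred):
    print(y_true)  # with eager execution this shows actual values, not a symbolic tensor
    return tf.reduce_mean(tf.square(y_true - y_pred))

# Called directly on eager tensors, the print shows concrete numbers:
loss_fn(tf.constant([[1.0, 2.0]]), tf.constant([[1.5, 1.5]]))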

If you get the error message "ValueError: Only TF native optimizers are supported in Eager mode." and you have used, for example, 'adam' as the optimizer, you can change the model's compile arguments to

model.compile(optimizer = tf.train.AdamOptimizer(), loss = loss_fn, ...)

Update: TensorFlow 2.x

You only need to enable the "run_eagerly" parameter for Eager Execution of Keras models, as stated in Keras debugging tip 3:

model.compile(..., run_eagerly = True)

Afterwards you can output the tensor in your custom loss function using print(y_true), tf.print(y_true) or K.print_tensor(y_true).
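For example, a minimal sketch (the model and the loss body are illustrative assumptions, not part of the original answer):

import tensorflow as tf

def loss_fn(y_true, y_pred):
    tf.print("y_true =", y_true)  # runs eagerly because of run_eagerly=True
    return tf.reduce_mean(tf.square(y_true - y_pred))

model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(3,))])
model.compile(optimizer="adam", loss=loss_fn, run_eagerly=True)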

Upvotes: 3

user2585501

Reputation: 606

To obtain the output values of arbitrary intermediate Keras tensors ("How can I print the values of Keras tensors?"), a different solution appears to be required. To print the output of a single layer (from https://stackoverflow.com/a/65288168/2585501):

from tensorflow.keras import backend as K
layerIndex = 1
func = K.function([model.get_layer(index=0).input], model.get_layer(index=layerIndex).output)
layerOutput = func([input_data])  # input_data is a numpy array
print(layerOutput)

Upvotes: 0

Adi Shumely

Reputation: 397

To print the value of a tensor, the tensor needs to have a value. For example:

import tensorflow as tf
from tensorflow import keras

aa = tf.constant([1,5,3])
bb = keras.layers.Dense(4, name="my_tensor")
print('aa:',aa)
print('bb:',bb)


aa: tf.Tensor([1 5 3], shape=(3,), dtype=int32)
bb: <tensorflow.python.keras.layers.core.Dense object at 0x000001D4B0137048>

If I want to print bb, I need to give it an input like this:

aa = tf.constant([[1,5,3]])
bb = keras.layers.Dense(4, name="my_tensor")
print('bb.weights before a assign:',bb.weights,'\n')
print('bb:',bb(aa),'\n')                               
print('bb.weights:',bb.weights)

Output:

bb.weights before a assign: [] 

bb: tf.Tensor([[1.0374807 3.4536252 1.5064619 2.1762671]], shape=(1, 4), dtype=float32) 

bb.weights: [<tf.Variable 'my_tensor/kernel:0' shape=(3, 4) dtype=float32, numpy=
array([[ 0.885918  , -0.88332534, -0.40944284, -0.04479438],
       [-0.27336687,  0.34549594, -0.11853147,  0.15316617],
       [ 0.50613236,  0.8698236 ,  0.83618736,  0.4850769 ]],
      dtype=float32)>, <tf.Variable 'my_tensor/bias:0' shape=(4,) dtype=float32, numpy=array([0., 0., 0., 0.], dtype=float32)>]

If bb is a tensor inside a model, or a tensor whose input size is fixed, this will not work:

inputs = keras.Input(shape=(3,), name="inputs")
b = keras.layers.Dense(4, name="my_tensor")(inputs)

a = tf.constant([[1,5,3]])
print('b:',b(a),'\n')

Output:

TypeError: 'tensorflow.python.framework.ops.EagerTensor' object is not callable

I use a feature_extractor model to fix it:

inputs = keras.Input(shape=(3,), name="inputs")
bb = keras.layers.Dense(4, name="my_tensor")(inputs)

feature_extractor = keras.Model(
    inputs=inputs,
    outputs=bb,
)

aa = tf.constant([[1,5,3]])
print('feature_extractor:',feature_extractor(aa),'\n')

Output:

feature_extractor: tf.Tensor([[-4.9181094  4.956725  -1.8055304  2.6975303]], shape=(1, 4), dtype=float32) 



Upvotes: 1

remykarem

Reputation: 2490

You could redefine your loss function to return the value instead:

def loss_fn(y_true, y_pred):
    return y_true

Let's create some tensors:

from keras import backend as K

a = K.constant([1,2,3])
b = K.constant([4,5,6])

And use the keras.backend.eval() API to evaluate your loss function:

loss = loss_fn(a,b)
K.eval(loss)
# array([1., 2., 3.], dtype=float32)

Upvotes: 3

Peter Svehla

Reputation: 54

I use

print("y_true = " + str(y_true.eval()))

for debugging.

Upvotes: 0

nroulet

Reputation: 349

Keras' backend has print_tensor which enables you to do this. You can use it this way:

import keras.backend as K

def loss_fn(y_true, y_pred):
    y_true = K.print_tensor(y_true, message='y_true = ')
    y_pred = K.print_tensor(y_pred, message='y_pred = ')
    ...

The function returns an identical tensor. When that tensor is evaluated, it will print its content, preceded by message. From the Keras docs:

Note that print_tensor returns a new tensor identical to x which should be used in the following code. Otherwise the print operation is not taken into account during evaluation.

So, make sure to use the tensor afterwards.
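For example, a minimal sketch of a loss that prints both tensors and still uses them (the mean-squared-error body is an illustrative assumption, not part of the original answer):

import keras.backend as K

def loss_fn(y_true, y_pred):
    y_true = K.print_tensor(y_true, message='y_true = ')
    y_pred = K.print_tensor(y_pred, message='y_pred = ')
    # Use the returned tensors so the print ops are part of the evaluated graph:
    return K.mean(K.square(y_pred - y_true), axis=-1)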

Upvotes: 34

H. M. Tarek Ullah

Reputation: 126

You can't get the values from a symbolic tensor variable directly. You need to write a Theano function to extract the value. Don't forget to choose Theano as the Keras backend.

Check the notebook link to get some basics of Theano variables and functions: get tensor value in call function of own layers
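A minimal sketch of that idea using the Keras backend, which compiles a Theano function under the hood when Theano is the backend (the placeholder and the expression here are illustrative assumptions):

import numpy as np
import keras.backend as K

x = K.placeholder(shape=(None,))        # symbolic variable
y = x * 2                               # symbolic expression we want to inspect
f = K.function([x], [y])                # compiled backend (Theano) function
print(f([np.array([1., 2., 3.])]))      # -> [array([2., 4., 6.])]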

Upvotes: 0

Igor Poletaev

Reputation: 339

Usually, you already know y_true in advance, during the preparation of your training corpora...

However, there's one trick to see the values inside y_true and/or y_pred. Keras gives you the opportunity to write a callback for printing the neural network's output. It will look something like this:

def loss_fn(y_true, y_pred):
    return y_true  # or y_pred
...
import keras.callbacks as cbks

class CustomMetrics(cbks.Callback):

    def on_epoch_end(self, epoch, logs=None):
        for k in logs:
            if k.endswith('loss_fn'):
                print(logs[k])

Here loss_fn is the name of your loss function, which you pass into model.compile(..., metrics=[loss_fn]) during the model's compilation.
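A minimal sketch of that compile call (the optimizer and the real loss here are placeholders, not from the original answer):

model.compile(optimizer='adam',
              loss='mean_squared_error',
              metrics=[loss_fn])  # appears in the logs as 'loss_fn' / 'val_loss_fn'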

So, finally, you have to pass this CustomMetrics callback as an argument to model.fit():

model.fit(x=train_X, y=train_Y, ... , callbacks=[CustomMetrics()])

P.S.: If you use Theano (or TensorFlow) as the Keras backend, as here, you write a Python program, and then you compile and execute it. So, in your example, y_true is just a tensor variable which is used for further compilation and loss computation.

That means there's no way to see the values inside it. In Theano, for example, you can only look inside a so-called shared variable after the execution of the respective eval() function. See this question for more info.

Upvotes: 5
