Reputation: 3321
I am trying to understand how Keras/TensorFlow work.
In the example I'm working with, an LSTM
network uses a custom loss
function.
I want to print the values of the y_pred
and loss
variables in this example, but a standard print()
call does not show the actual numeric values.
When I call print(),
I get the following output: Tensor("loss_13/dense_14_loss/strided_slice:0", shape=(), dtype=float32)
import tensorflow as tf
from tensorflow.keras import Sequential, backend as K
from tensorflow.keras.layers import Dense, LSTM, Dropout
from tensorflow.keras.losses import categorical_crossentropy

def weight_fx(weights):
    weights = K.variable(weights)
    def loss(y_true, y_pred):
        y_pred /= K.sum(y_pred, axis=-1, keepdims=True)
        print(y_pred)  # prints a symbolic Tensor, not numeric values
        loss = y_true * K.log(y_pred) * weights
        return loss
    return loss

regressor = Sequential()
regressor.add(LSTM(units=10, dropout=0.10, return_sequences=True,
                   input_shape=(X.shape[1], X.shape[2])))
regressor.add(Dense(units=4, activation='softmax'))
regressor.compile(optimizer=optimizer,
                  loss=weight_fx(np.array([0.005, 0.20, 0.79, 0.005])),
                  metrics=['categorical_accuracy'])
Upvotes: 1
Views: 256
Reputation: 938
Try doing it like this:
import tensorflow as tf
from tensorflow.keras import Sequential, backend as K
from tensorflow.keras.layers import Dense, LSTM, Dropout
from tensorflow.keras.losses import categorical_crossentropy
import numpy as np

X = tf.ones((10, 10, 10))
y = tf.ones((10, 1))

def weight_fx(weights):
    weights = K.variable(weights)
    def loss(y_true, y_pred):
        y_pred /= K.sum(y_pred, axis=-1, keepdims=True)
        tf.print(y_pred)  # prints actual values, even in graph mode
        loss = y_true * K.log(y_pred) * weights
        return loss
    return loss

regressor = Sequential()
regressor.add(LSTM(units=10, dropout=0.10, return_sequences=True,
                   input_shape=(X.shape[1], X.shape[2])))
regressor.add(Dense(units=4, activation='softmax'))
regressor.compile(optimizer='adam',
                  loss=weight_fx(np.array([0.005, 0.20, 0.79, 0.005])),
                  metrics=['categorical_accuracy'])
regressor.fit(X, y)
Why does print() show Tensor("loss_13/dense_14_loss/strided_slice:0", shape=(), dtype=float32) instead of numbers? What you see there is a detector, or a 'tracer': Keras traces your loss function into a graph, so the Python print runs only once, at trace time, and never sees actual values. To print-debug, use tf.print. In my experience this sometimes works and sometimes doesn't. In case it doesn't, and you still see only the tracer variable, set model.run_eagerly = True, or pass run_eagerly=True as an argument to model.compile. Once run_eagerly is set, Python's built-in print will work even if you do not use tf.print (try this).
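As a minimal sketch of the run_eagerly route (the toy data, shapes, and 'adam' optimizer here are assumptions for illustration, not from the question):

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import Sequential, backend as K
from tensorflow.keras.layers import Dense, LSTM

X = tf.ones((10, 10, 10))
y = tf.ones((10, 1))

def weight_fx(weights):
    weights = K.variable(weights)
    def loss(y_true, y_pred):
        y_pred /= K.sum(y_pred, axis=-1, keepdims=True)
        print(y_pred)  # shows numeric values because the loss runs eagerly
        return y_true * K.log(y_pred) * weights
    return loss

model = Sequential()
model.add(LSTM(units=10, return_sequences=True, input_shape=(10, 10)))
model.add(Dense(units=4, activation='softmax'))
model.compile(optimizer='adam',
              loss=weight_fx(np.array([0.005, 0.20, 0.79, 0.005])),
              metrics=['categorical_accuracy'],
              run_eagerly=True)  # disable graph tracing for debugging
model.fit(X, y, epochs=1, verbose=0)
```

Note that run_eagerly=True is a debugging aid: it disables graph compilation, so training is slower and it should be removed once the loss behaves as expected.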
Last but not least, you can wrap all your side-effect functions in a tf.py_function. This requires a bit more code, and sample copy-and-paste code can be seen here. Also, make sure to define the function before you use it in model.compile, especially if you are using a Jupyter notebook; a buggy old declaration might still persist in memory and will probably ruin your day.
Did this help?
Upvotes: 3
Reputation: 390
I haven't tried this yet, but you should always use:
tf.print(value)
instead of just the normal
print(value)
TensorFlow implemented tf.print specifically for this purpose. Hopefully this helps!
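The difference is easiest to see inside a traced function (a small sketch; the function name is just for illustration):

```python
import tensorflow as tf

@tf.function
def double(x):
    print("print:", x)        # runs once, at trace time: shows a symbolic Tensor
    tf.print("tf.print:", x)  # runs on every call: shows the numeric values
    return x * 2

result = double(tf.constant([1.0, 2.0]))
```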
Upvotes: 1