Reputation: 4940
I have built a neural network with Keras. I would like to visualize its data with TensorBoard, so I have used:
keras.callbacks.TensorBoard(log_dir='/Graph', histogram_freq=0,
    write_graph=True, write_images=True)
as explained on keras.io. When I run the callback I get <keras.callbacks.TensorBoard at 0x7f9abb3898>, but I don't get any files in my "Graph" folder. Is there something wrong with how I have used this callback?
Upvotes: 160
Views: 131834
Reputation: 11553
keras.callbacks.TensorBoard(log_dir='./Graph', histogram_freq=0,
write_graph=True, write_images=True)
This line creates a TensorBoard callback object; you should capture that object and pass it to the fit function of your model.
tbCallBack = keras.callbacks.TensorBoard(log_dir='./Graph', histogram_freq=0, write_graph=True, write_images=True)
...
model.fit(...inputs and parameters..., callbacks=[tbCallBack])
This way you have given your callback object to the function. It will be run during training and will output files that can be used with TensorBoard.
If you want to visualize the files created during training, run the following in your terminal:
tensorboard --logdir path_to_current_dir/Graph
Upvotes: 241
Reputation: 8388
If you are using Google Colab, a simple way to visualize the graph would be:
import tensorboardcolab as tb

tbc = tb.TensorBoardColab()
tensorboard = tb.TensorBoardColabCallback(tbc)

# early_stopping is assumed to be an EarlyStopping callback defined earlier
history = model.fit(x_train,                    # Features
                    y_train,                    # Target vector
                    batch_size=batch_size,      # Number of observations per batch
                    epochs=epochs,              # Number of epochs
                    callbacks=[early_stopping, tensorboard],  # Early stopping and TensorBoard
                    verbose=1,                  # Print description after each epoch
                    validation_split=0.2,       # Ignored here, because validation_data is also given
                    validation_data=(x_test, y_test))  # Validation set, evaluated at the end of each epoch
Upvotes: 5
Reputation: 10669
Create the TensorBoard callback:
from keras.callbacks import TensorBoard
from datetime import datetime
logDir = "./Graph/" + datetime.now().strftime("%Y%m%d-%H%M%S") + "/"
tb = TensorBoard(log_dir=logDir, histogram_freq=2, write_graph=True, write_images=True, write_grads=True)
Pass the TensorBoard callback to the fit call:
history = model.fit(X_train, y_train, epochs=200, callbacks=[tb])
When running the model, if you get a Keras error of
"You must feed a value for placeholder tensor",
try resetting the Keras session before model creation by doing:
import keras.backend as K
K.clear_session()
Upvotes: 4
Reputation: 10888
Here is some code:
import keras
from keras import backend as K

# model, x_train, y_train, y_train_c, x_test, y_test, y_test_coarse,
# hype_space, EPOCHS and log_path come from the project linked below.

K.set_learning_phase(1)
K.set_image_data_format('channels_last')

tb_callback = keras.callbacks.TensorBoard(
    log_dir=log_path,
    histogram_freq=2,
    write_graph=True
)
tb_callback.set_model(model)
callbacks = []
callbacks.append(tb_callback)

# Train net:
history = model.fit(
    [x_train],
    [y_train, y_train_c],
    batch_size=int(hype_space['batch_size']),
    epochs=EPOCHS,
    shuffle=True,
    verbose=1,
    callbacks=callbacks,
    validation_data=([x_test], [y_test, y_test_coarse])
).history

# Test net:
K.set_learning_phase(0)
score = model.evaluate([x_test], [y_test, y_test_coarse], verbose=0)
Basically, histogram_freq=2
is the most important parameter to tune when using this callback: it sets the interval, in epochs, at which histogram summaries are computed, with the goal of generating fewer files on disk.
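As a small sketch of how that parameter behaves (the values here are only illustrative):
from keras.callbacks import TensorBoard

TensorBoard(log_dir='./Graph', histogram_freq=0)  # no histogram summaries at all, only scalars
TensorBoard(log_dir='./Graph', histogram_freq=2)  # compute histograms every 2 epochs (needs validation data)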
Here is an example visualization of the evolution of the values of the last convolution throughout training, as seen in TensorBoard under the "Histograms" tab (I found the "Distributions" tab to contain very similar charts, but flipped on the side).
In case you would like to see a full example in context, you can refer to this open-source project: https://github.com/Vooban/Hyperopt-Keras-CNN-CIFAR-100
Upvotes: 11
Reputation: 259
If you are working with the Keras library and want to use TensorBoard to plot your graphs of accuracy and other variables, then below are the steps to follow.
Step 1: Import TensorBoard from the Keras callbacks module using the command below.
from keras.callbacks import TensorBoard
Step 2: Include the command below in your program just before the "model.fit()" call.
tensor_board = TensorBoard(log_dir='./Graph', histogram_freq=0, write_graph=True, write_images=True)
Note: Use "./Graph". It will generate the Graph folder in your current working directory; avoid using "/Graph".
Step 3: Include the TensorBoard callback in "model.fit()". A sample is given below.
model.fit(X_train,y_train, batch_size=batch_size, epochs=nb_epoch, verbose=1, validation_split=0.2,callbacks=[tensor_board])
Step 4: Run your code and check whether the Graph folder is in your working directory. If the code above runs correctly, you will have a "Graph" folder in your working directory.
Step 5: Open a terminal in your working directory and type the command below.
tensorboard --logdir ./Graph
Step 6: Now open your web browser and enter the address below.
http://localhost:6006
After entering it, the TensorBoard page will open, where you can see graphs of your different variables.
Upvotes: 16
Reputation: 136177
This is how you use the TensorBoard callback:
from keras.callbacks import TensorBoard
tensorboard = TensorBoard(log_dir='./logs', histogram_freq=0,
                          write_graph=True, write_images=False)
# define model
model.fit(X_train, Y_train,
          batch_size=batch_size,
          epochs=nb_epoch,
          validation_data=(X_test, Y_test),
          shuffle=True,
          callbacks=[tensorboard])
Upvotes: 50
Reputation: 3882
There are a few things.
First, use ./Graph, not /Graph.
Second, when you use the TensorBoard callback with histogram_freq > 0, always pass validation data, because without it the histogram summaries cannot be computed and training won't start (see the sketch below).
Third, if you want to use anything other than scalar summaries, you should only use the fit method, because fit_generator will not work; alternatively, you can rewrite the callback to work with fit_generator.
To add callbacks, just pass them via model.fit(..., callbacks=your_list_of_callbacks).
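A minimal sketch of the second point (x_val and y_val are just placeholders for your own validation arrays):
from keras.callbacks import TensorBoard

# histogram_freq > 0 requires validation data
tb = TensorBoard(log_dir='./Graph', histogram_freq=1, write_graph=True)

model.fit(x_train, y_train,
          epochs=10,
          validation_data=(x_val, y_val),  # needed for the histogram summaries
          callbacks=[tb])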
Upvotes: 2
Reputation: 1208
You should check out Losswise (https://losswise.com); it has a plugin for Keras that's easier to use than TensorBoard and has some nice extra features. With Losswise you would just use from losswise.libs import LosswiseKerasCallback
and then callback = LosswiseKerasCallback(tag='my fancy convnet 1')
and you're good to go (see https://docs.losswise.com/#keras-plugin).
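Put together, a minimal sketch (assuming your Losswise account is already set up as described in the linked docs, and that model, X_train and y_train are defined elsewhere):
from losswise.libs import LosswiseKerasCallback

# create the Losswise callback and pass it to fit like any other Keras callback
callback = LosswiseKerasCallback(tag='my fancy convnet 1')
model.fit(X_train, y_train, epochs=10, callbacks=[callback])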
Upvotes: 2
Reputation: 169
You wrote log_dir='/Graph'. Did you mean ./Graph instead? At the moment you are sending it to /Graph at the root of the filesystem.
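A quick illustration of the difference (the paths here are only examples):
from keras.callbacks import TensorBoard

TensorBoard(log_dir='/Graph')   # absolute path: writes to /Graph at the filesystem root
TensorBoard(log_dir='./Graph')  # relative path: writes to a Graph folder in the current working directory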
Upvotes: 2
Reputation: 319
Change
keras.callbacks.TensorBoard(log_dir='/Graph', histogram_freq=0,
write_graph=True, write_images=True)
to
tbCallBack = keras.callbacks.TensorBoard(log_dir='Graph', histogram_freq=0,
write_graph=True, write_images=True)
and set your model:
tbCallBack.set_model(model)
Then run in your terminal:
tensorboard --logdir Graph/
Upvotes: 21