nurasaki

Reputation: 109

Record time per step when training a sequential TensorFlow and Keras model

I'm training a sequential model with TensorFlow and Keras.

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.optimizers import SGD

# create the Sequential model
model = Sequential()

# add layers
model.add(Dense(1, activation='relu', input_shape=(2,)))
model.add(Dense(1, activation='sigmoid'))

# configure the network
model.compile(optimizer=SGD(lr=0.1), loss='mean_squared_error', metrics=['accuracy'])

# train the network
history = model.fit(X_train, y_train, validation_split=0.2, epochs=500, verbose=1)

Where the verbose output is:

Epoch 1/500
2/2 [==============================] - 0s 159ms/step - loss: 0.2474 - accuracy: 0.4499 - val_loss: 0.2482 - val_accuracy: 0.5714
Epoch 2/500
2/2 [==============================] - 0s 40ms/step - loss: 0.2469 - accuracy: 0.4980 - val_loss: 0.2478 - val_accuracy: 0.7857
Epoch 3/500
2/2 [==============================] - 0s 36ms/step - loss: 0.2459 - accuracy: 0.6568 - val_loss: 0.2474 - val_accuracy: 0.7143
Epoch 4/500
2/2 [==============================] - 0s 44ms/step - loss: 0.2459 - accuracy: 0.6820 - val_loss: 0.2470 - val_accuracy: 0.7143
Epoch 5/500
2/2 [==============================] - 0s 49ms/step - loss: 0.2453 - accuracy: 0.7636 - val_loss: 0.2468 - val_accuracy: 0.7143

...

I can retrieve the loss/accuracy history:

# Returns [0.24733777344226837, 0.24695482850074768, 0.24644054472446442, ... ]
history.history['loss']

# Returns [0.4716981053352356, 0.5283018946647644, 0.6415094137191772, ... ]
history.history['accuracy']

Is there a similar way to retrieve the time used in each step? Something like:

# Should return 
[159, 40, 36, 44 ... ]

Thanks!

Upvotes: 0

Views: 461

Answers (1)

mujjiga

Reputation: 16906

The per-step times are not recorded by any built-in callback; they are computed by the progress bar, which is not exposed outside of `fit()`. However, you can write a custom callback and calculate these values yourself. Note that the step count shown (e.g. `2/2`) is the number of batches per epoch.

Sample Code:

import time

import numpy as np
from tensorflow import keras


class MyCallback(keras.callbacks.Callback):
    def on_train_begin(self, logs=None):
        # mean batch time (in seconds) for each epoch
        self.times = []

    def on_epoch_begin(self, epoch, logs=None):
        self.current_batch_times = []

    def on_train_batch_begin(self, batch, logs=None):
        self.start = time.time()

    def on_train_batch_end(self, batch, logs=None):
        self.current_batch_times.append(time.time() - self.start)

    def on_epoch_end(self, epoch, logs=None):
        self.times.append(np.mean(self.current_batch_times))


my_callback = MyCallback()
history = model.fit(np.random.randn(32*100,2), np.random.randn(32*100,1), 
                    validation_split=0.2, epochs=5, verbose=1, batch_size=32,
                    callbacks=[my_callback])

# mean time per batch for each epoch, in milliseconds
print([np.round(t * 1000) for t in my_callback.times])

Output:

[962.0, 938.0, 918.0, 932.0, 912.0]
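If you only need the total wall-clock time of each epoch (rather than the mean batch time above), a simpler variant times whole epochs. This is a minimal sketch, not from the original answer; `EpochTimer` is a made-up name, and the `try`/`except` fallback only exists so the timing logic can be exercised even where TensorFlow is not installed:

```python
import time

try:
    from tensorflow import keras
    CallbackBase = keras.callbacks.Callback
except ImportError:
    # Fallback base class so the sketch runs without TensorFlow installed
    CallbackBase = object


class EpochTimer(CallbackBase):
    """Records the wall-clock duration of each epoch, in seconds."""

    def on_train_begin(self, logs=None):
        self.epoch_times = []

    def on_epoch_begin(self, epoch, logs=None):
        self._epoch_start = time.time()

    def on_epoch_end(self, epoch, logs=None):
        self.epoch_times.append(time.time() - self._epoch_start)
```

Usage would be the same pattern as above: `timer = EpochTimer()`, pass `callbacks=[timer]` to `model.fit(...)`, then read `timer.epoch_times` after training.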

Upvotes: 1
