Peter Force

Reputation: 449

Is there a way to update the weights of a layer/variable directly after a training step?

There are ways to get and update the weights directly outside of training, but what I am looking to do is this: after each training step, once the gradients have updated a particular variable or layer, I would like to save those weights to file and then replace the layer's or variable's weights with brand new ones. Training would then continue with the next step (a forward pass using the new values in the variable or layer, then a backward pass with the calculated loss/gradients).

I have thought about just calling each training step individually, but I am wondering whether that is significantly less time/memory efficient.
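For reference, here is a minimal sketch of the "call each training step individually" idea using train_on_batch. It assumes a compiled Keras model named model, numpy arrays x and y, and a placeholder layer index; names like num_steps and the replacement values are made up for illustration:

import numpy as np

batch_size = 32
for step in range(num_steps):  # num_steps is assumed to be defined
    xb = x[step * batch_size:(step + 1) * batch_size]
    yb = y[step * batch_size:(step + 1) * batch_size]
    loss = model.train_on_batch(xb, yb)  # one forward + backward pass
    model.save_weights('weights_' + str(step) + '.h5')  # save the just-updated weights
    old = model.layers[1].get_weights()  # e.g. [kernel, bias] for a Dense layer
    new_kernel = np.random.normal(size=old[0].shape)  # placeholder replacement values
    model.layers[1].set_weights([new_kernel] + old[1:])
    # the next iteration's forward pass uses the replaced weights

The per-batch computation here is the same as inside fit; the extra cost is Python loop overhead, which is usually small compared to the gradient computation, though you do lose fit's built-in progress bar and callback machinery.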

Upvotes: 2

Views: 1013

Answers (1)

Daniel Möller

Reputation: 86650

You can try to use a Callback to do that.

Define the function you want:

from keras import backend as K

def afterBatch(batch, logs):
    model.save_weights('weights' + str(batch) + '.h5')  # you may also want a way to record the current epoch

    # option 1: replace all of the layer's weights at once
    model.layers[select_a_layer].set_weights(listOfNumpyWeights)  # a list of numpy arrays, one per weight variable

    # option 2: replace a single weight variable
    K.set_value(model.layers[select_a_layer].kernel, newValueInNumpy)
    # depending on the layer you have kernel, bias and maybe other weight names;
    # take a look at the source code of the layer whose weights you are changing
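If you'd rather not dig through the layer source, a quick way to see which weight variables a layer exposes is to print them by name (a small sketch, assuming the same model and select_a_layer as above; with the TensorFlow backend the names look like those in the comment):

for w in model.layers[select_a_layer].weights:
    print(w.name)  # e.g. 'dense_1/kernel:0', 'dense_1/bias:0'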

Use a LambdaCallback:

from keras.callbacks import LambdaCallback

batchCallback = LambdaCallback(on_batch_end=afterBatch)
model.fit(x, y, callbacks=[batchCallback, ...])  # plus any other callbacks you use

Weights are updated after every batch, so this callback fires very often (which might be too much if you are saving weights every time). You can also try on_epoch_end instead of on_batch_end.
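Equivalently, the same logic can go in a subclassed Callback, which avoids closing over the model from the outer scope (Keras sets self.model during fit). A sketch, where make_new_weights is a hypothetical function returning a list of numpy arrays:

from keras.callbacks import Callback

class ReplaceWeights(Callback):
    def __init__(self, layer_index, make_new_weights):
        super(ReplaceWeights, self).__init__()
        self.layer_index = layer_index
        self.make_new_weights = make_new_weights  # hypothetical: returns a list of numpy arrays

    def on_batch_end(self, batch, logs=None):
        self.model.save_weights('weights' + str(batch) + '.h5')
        self.model.layers[self.layer_index].set_weights(self.make_new_weights())

model.fit(x, y, callbacks=[ReplaceWeights(select_a_layer, make_new_weights)])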

Upvotes: 3
