acdr

How to see the values of a keras layer's variables' slots

Using Keras (TF2), I'm writing my own optimiser. In this optimiser, I keep track of moving averages of the squares of all the weights and of all the gradients. I store these in so-called slots, created as follows:

    def _create_slots(self, var_list):
        # Separate for-loops required for some reason
        for var in var_list:
            self.add_slot(var, 'squared_gradient_ma', initializer="zeros")
        for var in var_list:
            self.add_slot(var, 'squared_param_ma', initializer="zeros")

I update these in the _resource_apply_dense method, as follows:

    # math_ops and state_ops come from tensorflow.python.ops
    from tensorflow.python.ops import math_ops, state_ops

    def _resource_apply_dense(self, grad, var):
        squared_gradient_ma = self.get_slot(var, 'squared_gradient_ma')
        squared_param_ma = self.get_slot(var, 'squared_param_ma')
        ma_decay = 1 / 1000000

        new_squared_gradient_ma = state_ops.assign(
            squared_gradient_ma,
            (1.0 - ma_decay) * squared_gradient_ma + ma_decay * math_ops.square(grad),
            use_locking=self._use_locking,
            )
        
        new_squared_param_ma = state_ops.assign(
            squared_param_ma,
            (1.0 - ma_decay) * squared_param_ma + ma_decay * math_ops.square(var),
            use_locking=self._use_locking,
            )

(Naturally, there's also code to update the actual weights, not shown.)

This makes it so that these two moving averages are updated on every training batch.

Now, at the end of training, I can inspect the weights that the model learned, by calling layer.get_weights(). Is there a similar thing I can call to see the values of these two slots that I made?


Answers (1)

Lescurel

Assuming your optimizer subclasses tf.keras.optimizers.Optimizer and you are training with model.fit, you can access the optimizer's weights (which include all slot variables) with:

    model.optimizer.get_weights()
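To read one particular slot, the same get_slot accessor used inside the optimizer also works from outside it. A minimal runnable sketch, assuming a TF2 version where the slot-based optimizer API is available (it lives under tf.keras.optimizers.legacy from TF 2.11 onwards), and using SGD's built-in 'momentum' slot as a stand-in for the custom squared_gradient_ma / squared_param_ma slots:

```python
import numpy as np
import tensorflow as tf

# TF >= 2.11 moved the slot-based optimizer classes under `legacy`;
# on earlier TF2 releases tf.keras.optimizers.SGD already is that class.
try:
    SGD = tf.keras.optimizers.legacy.SGD
except AttributeError:
    SGD = tf.keras.optimizers.SGD

# Tiny model; SGD with momentum creates one 'momentum' slot per variable,
# standing in for the custom moving-average slots from the question.
model = tf.keras.Sequential([tf.keras.Input(shape=(4,)),
                             tf.keras.layers.Dense(1)])
model.compile(optimizer=SGD(momentum=0.9), loss="mse")
model.fit(np.random.rand(8, 4), np.random.rand(8, 1), epochs=1, verbose=0)

# Option 1: every optimizer variable at once, as numpy arrays
# (the iteration counter comes first, then the slot variables):
opt_weights = model.optimizer.get_weights()

# Option 2: one slot for one model variable, mirroring the get_slot
# calls inside _resource_apply_dense; .numpy() gives the values,
# analogous to layer.get_weights():
kernel = model.layers[0].kernel
momentum_values = model.optimizer.get_slot(kernel, "momentum").numpy()
print(momentum_values.shape)  # same shape as the kernel
```

The slot variable always has the same shape as the model variable it was created for, so for your optimiser you would pass your own slot names ('squared_gradient_ma', 'squared_param_ma') to get_slot instead of 'momentum'.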
