Reputation: 2337
I am currently doing some experiments where I modify the weights, but not the biases, of each convolutional layer of a model.
For each layer of the model, I used layer.get_weights()[0]
to get the weights. After modifying the weight values for a particular layer, I want to set them back on the corresponding layer.
I wanted to use the set_weights()
method for that purpose; however, it takes both the weights and the bias as input, so I could not achieve this. What is the simplest way to set the modified weight values back on the layers of the model while keeping the bias exactly as it is?
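For reference, here is a simplified sketch of what I am trying so far (the model below is just an example, and the 0.5 scaling is only a placeholder for my actual modification):
from tensorflow import keras

# A small example model; my real model has several Conv2D layers like this.
model = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),
    keras.layers.Conv2D(8, 3, activation="relu"),
    keras.layers.Conv2D(16, 3, activation="relu"),
])

for layer in model.layers:
    weights = layer.get_weights()[0]  # the kernel only, not the bias
    modified = weights * 0.5          # placeholder for my actual modification
    # This is where I am stuck: set_weights() expects the bias as well,
    # so I cannot just pass the modified kernel back on its own.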
I am just a beginner, so if the question is not appropriate, kindly give me some suggestions and ideas.
Upvotes: 1
Views: 75
Reputation: 81
layer.get_weights()
returns a list of NumPy arrays: element 0 holds the weights and element 1 holds the biases. I don't remember off the top of my head whether this list can contain anything else, but that is not important in your situation.
So you can do something like:
params = layer.get_weights()        # [weights, biases] for a standard conv layer
weights = params[0]
biases = params[1]
my_weights = <your modifications>   # apply your changes to `weights` here
layer.set_weights([my_weights, biases])  # biases are written back unchanged
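For completeness, here is a minimal sketch that loops over every Conv2D layer of a model, modifies the kernels, and writes them back while keeping the biases untouched (the model and the 0.5 scaling are only illustrative):
from tensorflow import keras

model = keras.Sequential([
    keras.Input(shape=(32, 32, 3)),
    keras.layers.Conv2D(16, 3, activation="relu"),
    keras.layers.Conv2D(32, 3, activation="relu"),
    keras.layers.Flatten(),
    keras.layers.Dense(10),
])

for layer in model.layers:
    if not isinstance(layer, keras.layers.Conv2D):
        continue
    kernel, bias = layer.get_weights()    # [kernel, bias] when use_bias=True
    new_kernel = kernel * 0.5             # placeholder for your modification
    layer.set_weights([new_kernel, bias]) # bias is passed back unchanged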
Upvotes: 1