Reputation: 23
I have defined two variables for the weights and biases. How do I use those variables in Keras? Basically, what I am trying to do is the following:
import tensorflow as tf
from keras.models import Sequential
from keras.layers import Dense

w = tf.get_variable("weight", shape=[784, 512], trainable=True)
b = tf.get_variable("bias", shape=[512], trainable=True)
model = Sequential()
model.add(Dense(512, activation='relu', input_shape=(784,), weights=w, biases=b))
Does anyone know how to do this with Keras?
Upvotes: 2
Views: 913
Reputation: 19836
Pass in a NumPy array directly; Keras will handle the tensor conversion for you. Also, the weights argument handles both the 'regular' weights (the kernel) and the biases. Full example below:
from keras.layers import Dense
from keras.models import Sequential
import numpy as np

input_shape = (784,)
dense_dim = 512

W = np.random.randn(input_shape[0], dense_dim)  # kernel, shape (input_dim, units)
b = np.random.randn(dense_dim)                  # bias, shape (units,)

model = Sequential()
model.add(Dense(dense_dim, activation='relu', input_shape=input_shape, weights=[W, b]))
Be sure to pass in the weights in the order in which the layer expects them, which can be inspected directly:
print(model.layers[0].weights)
[<tf.Variable 'dense_1/kernel:0' shape=(784, 512) dtype=float32_ref>,
<tf.Variable 'dense_1/bias:0' shape=(512,) dtype=float32_ref>]
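To double-check that the initial values were actually applied, you can compare them against layer.get_weights(), which returns plain NumPy arrays in the same [kernel, bias] order; a minimal sketch reusing W and b from the example above:

kernel, bias = model.layers[0].get_weights()          # NumPy copies, [kernel, bias] order
print(np.allclose(kernel, W), np.allclose(bias, b))   # should print: True True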
To set weights after building the model, use layer.set_weights():
model.layers[0].set_weights([W, b]) # again, mind the order
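Note that set_weights() replaces all of the layer's weights at once, so if you only want to swap the kernel you can keep the current bias; a small sketch, again reusing W from above:

current_kernel, current_bias = model.layers[0].get_weights()
model.layers[0].set_weights([W, current_bias])   # new kernel, existing bias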
Using tf.get_variable: this can't be done; per the set_weights() source code, K.batch_set_value is used, which operates on raw array values rather than tensors. If your goal is to track a layer's weight variables, you can simply fetch them directly and use K.eval() to get their values (or .numpy() in TF2):
import tensorflow as tf
import keras.backend as K

dense1_weights, dense1_biases = model.layers[0].weights

if tf.__version__[0] == '2':
    print(dense1_weights.numpy())    # eager tensors in TF2
else:
    print(K.eval(dense1_weights))    # evaluate the variable via the backend session
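To connect this back to the question's w and b created with tf.get_variable: the closest workaround is to evaluate them to NumPy arrays first and then hand those to set_weights(). A minimal sketch, assuming TF1-style graph mode with the standalone Keras TF backend (in TF2/eager you could call w.numpy() directly):

sess = K.get_session()                        # the session Keras is using
sess.run(tf.variables_initializer([w, b]))    # make sure w and b have concrete values
w_val, b_val = sess.run([w, b])               # fetch them as NumPy arrays
model.layers[0].set_weights([w_val, b_val])   # shapes (784, 512) and (512,) match the layer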
Upvotes: 1