Reputation: 5499
In a Keras Sequential model, one can set a layer's weights directly using the set_weights
method:
model.layers[n].set_weights([your_weights])
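For reference, a minimal sketch of that Sequential case (the two-layer model and zero-valued arrays are hypothetical; set_weights expects a list of NumPy arrays whose shapes match the layer's weights):
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

model = Sequential([Dense(4, input_shape=(8,)), Dense(1)])
kernel = np.zeros((8, 4))  # must match the layer's kernel shape
bias = np.zeros((4,))      # must match the layer's bias shape
model.layers[0].set_weights([kernel, bias])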
However, I am facing a problem when trying to set the weights of a layer built with the functional API.
Here is the code snippet:
emb = Embedding(max_words, embedding_dim, input_length=maxlen)(merge_ip)
#skipping some lines
.
.
emb.set_weights([some_weight_matrix])
This throws the error
AttributeError: 'Tensor' object has no attribute 'set_weights'
which I think is because emb
is a Tensor object.
I am wondering how to set the weights properly in my model.
Upvotes: 1
Views: 1822
Reputation: 24099
If you want to set the weights of an Embedding layer, you can pass them to the constructor like this:
from keras.layers import Embedding
embedding_layer = Embedding(len(word_index) + 1,
                            EMBEDDING_DIM,
                            weights=[embedding_matrix],
                            input_length=MAX_SEQUENCE_LENGTH,
                            trainable=False)
https://blog.keras.io/using-pre-trained-word-embeddings-in-a-keras-model.html
You can then apply the layer to merge_ip:
x = embedding_layer(merge_ip)
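Putting it together, a minimal end-to-end sketch (the placeholder sizes, the random embedding_matrix, and the Input standing in for the question's merge_ip are all assumptions):
import numpy as np
from keras.layers import Input, Embedding, Flatten, Dense
from keras.models import Model

max_words, embedding_dim, maxlen = 10000, 100, 50            # placeholder sizes
embedding_matrix = np.random.rand(max_words, embedding_dim)  # stand-in for real pre-trained embeddings

merge_ip = Input(shape=(maxlen,), dtype='int32')  # stands in for the question's merge_ip
embedding_layer = Embedding(max_words, embedding_dim,
                            weights=[embedding_matrix],
                            input_length=maxlen,
                            trainable=False)
x = embedding_layer(merge_ip)
x = Flatten()(x)
out = Dense(1, activation='sigmoid')(x)
model = Model(merge_ip, out)
model.summary()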
Upvotes: 5
Reputation: 1318
embed_layer = Embedding(max_words, embedding_dim, input_length=maxlen)
emb = embed_layer(merge_ip)
embed_layer.set_weights([some_weight_matrix])
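The key point is to keep a reference to the layer object itself: calling embed_layer(merge_ip) returns a Tensor, which has no set_weights method, while the Embedding layer does. Note that set_weights takes a list of NumPy arrays matching the layer's weight shapes, and the layer must already be built before you call it (here it is, because it has been called on merge_ip).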
Upvotes: 1