james_24

Reputation: 99

Issue with embedding pre-trained model in Keras

I have a pre-trained Fasttext model and I want to embed it in Keras.

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding

model = Sequential()
model.add(Embedding(MAX_NB_WORDS, 
                    EMBEDDING_DIM, 
                    input_length=X.shape[1],
                    weights=[embedding_matrix],
                    trainable=False))

But it didn't work.

I found that lots of people have the same problem embedding a pre-trained model in Keras, and all of them were left without a solution.

It seems like weights and embeddings_initializer are deprecated.

Is there an alternative method that solves the problem? Thanks in advance.

Upvotes: 0

Views: 678

Answers (1)

user8879803

Reputation:

The weights parameter is deprecated in the Embedding layer of Keras.

With the current API, the embedding layer will look like below -

from tensorflow.keras.initializers import Constant
from tensorflow.keras.layers import Embedding

embedding_layer = Embedding(num_words,
                            EMBEDDING_DIM,
                            embeddings_initializer=Constant(embedding_matrix),
                            input_length=MAX_SEQUENCE_LENGTH,
                            trainable=False)
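
For context, the embedding_matrix passed to Constant holds one row per word index. Below is a minimal sketch of how it can be built from a pre-trained FastText model, assuming gensim's KeyedVectors loader; the wiki.en.vec path, the toy texts corpus, and the constants are placeholders, not part of the original answer -

import numpy as np
from gensim.models import KeyedVectors
from tensorflow.keras.preprocessing.text import Tokenizer

EMBEDDING_DIM = 300   # must match the dimensionality of the vector file
MAX_NB_WORDS = 20000

texts = ["a placeholder corpus", "replace these with your own documents"]

# Fit a tokenizer so that every word gets an integer index.
tokenizer = Tokenizer(num_words=MAX_NB_WORDS)
tokenizer.fit_on_texts(texts)
word_index = tokenizer.word_index

# Load the pre-trained FastText vectors (the path is a placeholder).
vectors = KeyedVectors.load_word2vec_format('wiki.en.vec')

# Row i holds the vector for the word whose index is i; words missing
# from the FastText vocabulary keep their all-zero rows.
num_words = min(MAX_NB_WORDS, len(word_index) + 1)
embedding_matrix = np.zeros((num_words, EMBEDDING_DIM))
for word, i in word_index.items():
    if i < num_words and word in vectors:
        embedding_matrix[i] = vectors[word]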

You can find the latest details of the Embedding layer here - Keras Embedding Layer

You can find an example of using pre-trained word embeddings here - Pretrained Word Embedding
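
Once built, the layer drops straight into a Sequential model in place of the weights-based version from the question. This is a sketch; the Flatten/Dense head and the compile settings are illustrative, not prescribed by the answer -

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Flatten

model = Sequential()
model.add(embedding_layer)                  # frozen pre-trained embeddings
model.add(Flatten())
model.add(Dense(1, activation='sigmoid'))   # illustrative classifier head
model.compile(optimizer='adam', loss='binary_crossentropy')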

Upvotes: 3
