dter

Reputation: 1165

Multiple embedding layers in keras

With pretrained embeddings, we can pass them as the weights of Keras' Embedding layer. To use multiple embeddings, would specifying multiple Embedding layers be suitable? i.e.

embedding_layer1 = Embedding(len(word_index) + 1,
                             EMBEDDING_DIM,
                             weights=[embedding_matrix_1],
                             input_length=MAX_SEQUENCE_LENGTH,
                             trainable=False)

embedding_layer2 = Embedding(len(word_index) + 1,
                             EMBEDDING_DIM,
                             weights=[embedding_matrix_2],
                             input_length=MAX_SEQUENCE_LENGTH,
                             trainable=False)

model.add(embedding_layer1)
model.add(embedding_layer2)

Stacking them like this suggests composing them into a single layer, one feeding into the next, which is not what I am after. I want both embeddings applied to the same input.
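A minimal sketch of one common alternative (not from the original post): instead of stacking the layers sequentially, the Keras functional API can apply both embeddings to the same input in parallel and concatenate them along the feature axis. The names `EMBEDDING_DIM`, `MAX_SEQUENCE_LENGTH`, and the embedding matrices mirror the question; here the matrices are random stand-ins for real pretrained weights, and the vocabulary size and downstream head are arbitrary assumptions.

```python
import numpy as np
from tensorflow.keras.layers import Input, Embedding, Concatenate, Flatten, Dense
from tensorflow.keras.models import Model
from tensorflow.keras.initializers import Constant

VOCAB_SIZE = 100          # stand-in for len(word_index) + 1
EMBEDDING_DIM = 8         # stand-in value
MAX_SEQUENCE_LENGTH = 10  # stand-in value

# Hypothetical pretrained matrices; in practice these would come from
# e.g. GloVe or word2vec lookups over word_index.
embedding_matrix_1 = np.random.rand(VOCAB_SIZE, EMBEDDING_DIM)
embedding_matrix_2 = np.random.rand(VOCAB_SIZE, EMBEDDING_DIM)

seq_input = Input(shape=(MAX_SEQUENCE_LENGTH,), dtype="int32")

# Both embedding layers receive the *same* integer sequence input.
emb1 = Embedding(VOCAB_SIZE, EMBEDDING_DIM,
                 embeddings_initializer=Constant(embedding_matrix_1),
                 trainable=False)(seq_input)
emb2 = Embedding(VOCAB_SIZE, EMBEDDING_DIM,
                 embeddings_initializer=Constant(embedding_matrix_2),
                 trainable=False)(seq_input)

# Concatenate on the feature axis: (batch, seq_len, 2 * EMBEDDING_DIM)
merged = Concatenate(axis=-1)([emb1, emb2])

# Arbitrary downstream head just to close the model.
output = Dense(1, activation="sigmoid")(Flatten()(merged))
model = Model(seq_input, output)
```

Keeping the embeddings side by side preserves both representations per token, whereas `model.add(embedding_layer2)` after `embedding_layer1` would try to feed float vectors into a layer that expects integer indices.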

Upvotes: 5

Views: 5890

Answers (2)

brain pinky

Reputation: 369

If you want to use multiple embedding layers in a model, the answer is in this thread: Multiple Embedding layers for Keras Sequential model

Upvotes: 0

anusha kamath

Reputation: 23

I have come across the same issue. Is it because the keras.Embedding layer internally uses some kind of object (let's call it x_object) that gets initialized in the Keras backend's global session K? Hence the second embedding layer throws an exception saying the x_object name already exists in the graph and cannot be added again.

Upvotes: 1
