Ilya

Reputation: 581

Making a custom multi-hot embedding layer in Keras/TF

I'd like to make a custom embedding layer in Keras, but I'm not sure how to go about it.

As input, I would pass a variable number of integers (indices) for each example, from which I would like to generate a fixed-size vector. A NumPy version of this embedding (with batch_size = 1) would be:

import numpy as np

class numpyEmbedding():

    def __init__(self, vocab_size):
        self.vocab_size = vocab_size
        self.build()

    def build(self):
        # Identity matrix: column i is the one-hot vector for index i
        self.W = np.eye(self.vocab_size, dtype=np.int8)

    def __call__(self, x):
        # Sum the one-hot columns selected by x -> a vector of index counts
        return np.sum(self.W[:, x], axis=-1)

I imagine a Keras version of this layer should be possible, but I am not sure how to get it working, or what considerations I need to keep in mind, since it would have to be applied to mini-batches of arrays rather than single arrays.

Thanks!

Ilya

Edit:

Example input:

vec = np.random.choice(np.arange(10), 100)
emb = numpyEmbedding(10)(vec)

Output:

array([11, 10, 11,  9,  8,  9, 13, 12,  6, 11])
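
Each entry counts how many times the corresponding index appears in vec, so with 100 samples the entries sum to 100.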

Upvotes: 1

Views: 1257

Answers (1)

Ilya

Reputation: 581

I was able to figure out the answer:

from tensorflow.keras import layers
from tensorflow.keras import backend as K

class MultihotEmbedding(layers.Layer):

    def __init__(self, vocab_size, **kwargs):
        self.vocab_size = vocab_size
        super(MultihotEmbedding, self).__init__(**kwargs)

    def call(self, x):
        # (batch, seq_len) integer indices -> (batch, seq_len, vocab_size) one-hot
        embeddings = K.one_hot(x, num_classes=self.vocab_size)
        # Sum over the sequence axis -> (batch, vocab_size) multi-hot counts
        return K.sum(embeddings, axis=-2)

    def compute_output_shape(self, input_shape):
        return (input_shape[0], self.vocab_size)
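
For completeness, here is a minimal usage sketch, assuming tensorflow.keras; the variable names and the random test batch are illustrative, not from the original answer. Note that the input must be an integer tensor, since K.one_hot expects integer indices:

import numpy as np
from tensorflow.keras import Input, Model

inp = Input(shape=(100,), dtype='int32')            # 100 indices per example
multihot = MultihotEmbedding(vocab_size=10)(inp)
model = Model(inp, multihot)

batch = np.random.choice(np.arange(10), (32, 100))  # mini-batch of 32 examples
print(model.predict(batch).shape)                   # (32, 10)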

Upvotes: 3
