Flavio Spadavecchia

Reputation: 21

Embeddings in sentiment analysis task with PyTorch and Multilayer Perceptron

I have the body of the code below, which I think should work, but it doesn't; the problem is probably something I'm messing up with the embedding.

import torch.nn as nn
class MultilayerPerceptron(nn.Module):

  def __init__(self, input_size, hidden_size): # I removed output size
    # Call the initializer of the superclass
    super(MultilayerPerceptron, self).__init__()
    self.embedding = nn.Embedding(INPUT_DIM, EMBEDDING_DIM) # added this myself, maybe wrong
    #self.mlp = nn.MultilayerPerceptron(EMBEDDING_DIM, HIDDEN_DIM) # also tried adding this
    self.INPUT_DIM = INPUT_DIM
    self.HIDDEN_DIM = HIDDEN_DIM
    self.OUTPUT_DIM = OUTPUT_DIM
    self.EMBEDDING_DIM = EMBEDDING_DIM

    # Whenever this model is called, the layers in the Sequential block
    # are applied in the order given.
    self.model = nn.Sequential(
        #nn.Flatten(), # tried adding this, hoping it would work (it didn't)
        nn.Linear(self.INPUT_DIM, self.HIDDEN_DIM), # later on, multiply by the embedding dimensionality?
        nn.ReLU(),
        nn.Linear(self.HIDDEN_DIM, self.OUTPUT_DIM), # one-layer neural network
        nn.ReLU(), # do I need this before the sigmoid?
        nn.Sigmoid(),
    )
    
  def forward(self, x):
    embedded = self.embedding(x)
    # embedded = [sent len, batch size, emb dim]
    output = self.model(embedded) # forward propagation through the block defined above
    return output

INPUT_DIM = len(TEXT.vocab)
EMBEDDING_DIM = 100 # how do I fit this into the model?
HIDDEN_DIM = 256
OUTPUT_DIM = 1

model = MultilayerPerceptron(INPUT_DIM, HIDDEN_DIM) # MLP instead

The error I get is "mat1 and mat2 shapes cannot be multiplied (50176x100 and 25002x256)".
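For reference, the mismatch is that the first `nn.Linear` expects `INPUT_DIM` (the vocabulary size, 25002) features, but it receives the embedding output, whose last dimension is `EMBEDDING_DIM` (100). A minimal sketch of one way to make the shapes line up, assuming the embeddings are mean-pooled over the sequence so each sentence becomes a single vector (batch sizes and sequence length here are illustrative, not from the post):

```python
import torch
import torch.nn as nn

class MultilayerPerceptron(nn.Module):
    def __init__(self, vocab_size, embedding_dim, hidden_dim, output_dim):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embedding_dim)
        self.model = nn.Sequential(
            # First Linear takes the embedding dimension, not the vocab size
            nn.Linear(embedding_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, output_dim),
            nn.Sigmoid(),  # single sigmoid output for binary sentiment
        )

    def forward(self, x):
        # x: [batch size, sent len] of token indices
        embedded = self.embedding(x)   # [batch size, sent len, emb dim]
        pooled = embedded.mean(dim=1)  # [batch size, emb dim], bag-of-embeddings
        return self.model(pooled)      # [batch size, output dim]

# Smoke test with illustrative sizes (25002 matches the vocab size in the error)
model = MultilayerPerceptron(25002, 100, 256, 1)
out = model(torch.randint(0, 25002, (8, 50)))
# out.shape -> torch.Size([8, 1])
```

The averaging step (`mean(dim=1)`) is one simple pooling choice; without some pooling or flattening, the Linear layers would be applied per token rather than per sentence.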

Upvotes: 1

Views: 177

Answers (0)
