user11619814

Reputation: 439

What is the meaning of hidden_dim and embed_size in LSTM?

I am trying to learn RNNs and LSTMs. I came across a tutorial for sentiment analysis. Below is the code from the tutorial, where word2idx is a dictionary mapping words to indices:

    class SentimentNet(nn.Module):
        def __init__(self, vocab_size, output_size, embedding_dim, hidden_dim, n_layers, drop_prob=0.5):
            super(SentimentNet, self).__init__()
            self.output_size = output_size
            self.n_layers = n_layers
            self.hidden_dim = hidden_dim

            # maps each word index to a dense vector of length embedding_dim
            self.embedding = nn.Embedding(vocab_size, embedding_dim)
            # stacked LSTM: takes embedding_dim features in, keeps hidden_dim features per layer
            self.lstm = nn.LSTM(embedding_dim, hidden_dim, n_layers, dropout=drop_prob, batch_first=True)
            self.dropout = nn.Dropout(drop_prob)
            # projects the hidden_dim features down to a single sentiment score
            self.fc = nn.Linear(hidden_dim, output_size)
            self.sigmoid = nn.Sigmoid()

    vocab_size = len(word2idx) + 1  # +1 accounts for the padding token
    output_size = 1
    embedding_dim = 400
    hidden_dim = 512
    n_layers = 2

Can anyone please tell me the meaning of vocab_size, embedding_dim, and hidden_dim?

Upvotes: 1

Views: 7712

Answers (1)

Nikaido

Reputation: 4629

At its most fundamental level, an LSTM (a type of recurrent neural network) is built out of densely connected layers, applied repeatedly at each time step.

The hidden dimension (hidden_dim) is the number of units in each LSTM layer's hidden state, analogous to the number of nodes in a layer of a multilayer perceptron.

The embedding size (embedding_dim) is the length of the feature vector that represents each word; the model takes these embedded words as input rather than raw indices. vocab_size is the number of distinct indices the embedding table has to cover, hence len(word2idx) + 1 in the code above.

Here are some more details.
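To make the shapes concrete, here is a minimal sketch (a toy example, not code from the tutorial; the batch size, sequence length, and vocabulary size are made up):

    import torch
    import torch.nn as nn

    vocab_size = 21      # toy vocabulary: 20 words + 1 padding index
    embedding_dim = 400  # each word is represented by a 400-dimensional vector
    hidden_dim = 512     # each LSTM layer carries a 512-dimensional hidden state
    n_layers = 2

    embedding = nn.Embedding(vocab_size, embedding_dim)
    lstm = nn.LSTM(embedding_dim, hidden_dim, n_layers,
                   dropout=0.5, batch_first=True)

    # a batch of 3 sentences, each padded/truncated to 5 word indices
    x = torch.randint(1, vocab_size, (3, 5))

    embedded = embedding(x)              # (3, 5, 400) = (batch, seq_len, embedding_dim)
    output, (h_n, c_n) = lstm(embedded)  # output: (3, 5, 512) = (batch, seq_len, hidden_dim)
    print(embedded.shape, output.shape, h_n.shape)
    # torch.Size([3, 5, 400]) torch.Size([3, 5, 512]) torch.Size([2, 3, 512])

So embedding_dim fixes the width of the input features, while hidden_dim fixes the width of what the LSTM computes at every step; the two are independent hyperparameters, which is why the final nn.Linear(hidden_dim, output_size) maps from hidden_dim.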

Upvotes: 3
