Tony G

Reputation: 55

LSTM Embedding output size and No. of LSTM

I am not sure why the Embedding layer outputs vectors of size 32 while the LSTM has 100 units.

What confuses me is this: if each word is embedded as a 32-dimensional vector, shouldn't an LSTM with 32 units be big enough to hold it?

model.add(Embedding(5000, 32))
model.add(LSTM(100))

Upvotes: 1

Views: 245

Answers (1)

nuric

Reputation: 11225

Those are hyper-parameters of your model, and there is no best way of setting them without experimentation. In your case, embedding single words into vectors of dimension 32 might be enough, but the LSTM processes a sequence of them and might require more capacity (i.e. more dimensions) to store information about multiple words. Without knowing the objective or the dataset, it is difficult to make an educated guess about what those parameters should be. Often we look at past research papers tackling similar problems, see what hyper-parameters they used, and then tune them via experimentation.
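To make the shapes concrete, here is a sketch (plain Python, no Keras required) of how the two sizes relate in the questioner's `Embedding(5000, 32)` → `LSTM(100)` model. The embedding dimension (32) and the number of LSTM units (100) are independent: the embedding sets the size of each word vector, while the LSTM units set the size of the hidden state that summarizes the whole sequence. The parameter counts below follow the standard Keras formulas for these layers:

```python
vocab_size = 5000   # number of distinct word indices the Embedding can map
embed_dim = 32      # each word index becomes a vector of this size
lstm_units = 100    # dimensionality of the LSTM's hidden state

# Embedding: a lookup table with one embed_dim-sized row per word.
embedding_params = vocab_size * embed_dim
print(embedding_params)  # 160000

# LSTM: 4 gates, each with an input weight matrix (embed_dim x units),
# a recurrent weight matrix (units x units), and a bias vector (units).
lstm_params = 4 * (embed_dim * lstm_units + lstm_units * lstm_units + lstm_units)
print(lstm_params)  # 53200
```

Note that the LSTM's recurrent weights (`units x units`) depend only on the number of units, not on the embedding size, which is why the 100-unit hidden state can accumulate information about many 32-dimensional word vectors as the sequence is processed.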

Upvotes: 1
