Reputation: 43491
My Keras model is:
model = Sequential()
model.add(Embedding(input_dim=(self.BATCH_SIZE, len(self._tokens)), output_dim=1024))
model.add(LSTM(128))
model.add(Dropout(rate=0.5))
model.add(Dense(len(self._tokens)))
model.add(Activation('softmax'))
and I get an error:
ValueError: Argument must be a dense tensor: ((10, 4945), 1024) - got shape [2], but wanted [2, 2].
I'm not sure what I'm doing incorrectly. Any help would be appreciated.
Upvotes: 1
Views: 699
Reputation: 8699
As per the official Keras documentation, the input_dim parameter of the Embedding layer should be the size of the vocabulary, i.e. the maximum integer index + 1 (an int > 0). You are passing a tuple of (batch size, vocabulary size) instead; the batch size is never part of input_dim.
So your code should be:
model.add(Embedding(input_dim=len(self._tokens), output_dim=1024))
If you haven't passed the input_length parameter to the Embedding layer, its expected input shape is
input_shape = (None,)   # variable-length sequences
otherwise it is
input_shape = (input_length,)   # fixed-length sequences, via the 'input_length=' parameter
For more info check the official code here.
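A minimal sketch of the corrected model, assuming `self._tokens` is the token vocabulary (a tiny dummy vocabulary stands in for it here; only its size matters):

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dropout, Dense, Activation

# Hypothetical stand-in for self._tokens; a real vocabulary of 4945
# tokens works the same way.
tokens = ["<pad>", "the", "cat", "sat"]

model = Sequential()
# input_dim is the vocabulary size (max integer index + 1), NOT a
# (batch_size, vocab_size) tuple.
model.add(Embedding(input_dim=len(tokens), output_dim=1024))
model.add(LSTM(128))
model.add(Dropout(rate=0.5))
model.add(Dense(len(tokens)))
model.add(Activation('softmax'))

# One batch of two length-3 sequences of integer token indices.
out = model(np.array([[1, 2, 3], [2, 3, 1]]))
print(out.shape)  # (2, 4): one softmax over the vocabulary per sequence
```

Each input element must be an integer index in `[0, len(tokens))`; the batch and sequence dimensions come from the data, not from `input_dim`.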
Upvotes: 1