Pablo Castilla

Reputation: 2741

logits and labels must have the same first dimension

I am trying to create a recipe generator on Kaggle using TensorFlow and an LSTM, but I am totally stuck on something related to dimensions. Can someone point me in the right direction?

https://www.kaggle.com/pablocastilla/d/kaggle/recipe-ingredients-dataset/ingredients-recomender-using-lstm-with-tensorflow/run/1066831

Thanks so much!

Upvotes: 0

Views: 5489

Answers (2)

kafman

Reputation: 2860

Here's an excerpt from the implementation of seq2seq.sequence_loss(logits, targets, weights), which you use in your code:

with ops.name_scope(name, "sequence_loss", [logits, targets, weights]):
    num_classes = array_ops.shape(logits)[2]
    logits_flat = array_ops.reshape(logits, [-1, num_classes])
    targets = array_ops.reshape(targets, [-1])
    if softmax_loss_function is None:
      crossent = nn_ops.sparse_softmax_cross_entropy_with_logits(
          labels=targets, logits=logits_flat)

I believe the error you see is stemming from the last line in that code. The error message is self-explanatory:

InvalidArgumentError: logits and labels must have the same first dimension, got logits shape [8,6714] and labels shape [2]

That is, the first dimension of logits_flat and targets must have the same size. This translates directly to your input to seq2seq.sequence_loss: the first two dimensions (batch size and sequence length) of your targets and logits tensors must match. So either the two tensors were built with different batch sizes, or their sequence lengths somehow differ (which would be odd).
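For reference, here is a minimal sketch of the shapes sequence_loss expects (assuming the TF 1.x-era tf.contrib.seq2seq path; the sizes are made up for illustration): logits is rank 3 and targets is rank 2, so both flatten to the same number of rows.

import tensorflow as tf

batch_size, seq_len, num_classes = 4, 2, 6714  # illustrative sizes only

# logits: [batch_size, seq_len, num_classes] -> flattens to [8, 6714]
logits = tf.placeholder(tf.float32, [batch_size, seq_len, num_classes])
# targets: [batch_size, seq_len] -> flattens to [8], matching logits' rows
targets = tf.placeholder(tf.int32, [batch_size, seq_len])
# weights: [batch_size, seq_len], e.g. all ones if every step counts
weights = tf.ones([batch_size, seq_len])

loss = tf.contrib.seq2seq.sequence_loss(logits, targets, weights)

If your targets flatten to a different number of rows than your logits (as in the [8, 6714] vs. [2] error above), one of the two was built with the wrong batch size or sequence length.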

Upvotes: 1

Pietro Tortella

Reputation: 1114

I think the issue is that

training_batches[0][1] 

is a list and not a numpy.array; you should modify create_datasets accordingly.
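A quick sketch of the conversion (the nested list below is just a stand-in for whatever create_datasets currently returns): a plain Python list has no shape, while the numpy array carries the 2-D [batch_size, seq_len] shape that sequence_loss's targets expect.

import numpy as np

# Stand-in for what create_datasets might currently produce: a list of
# per-example target sequences rather than a numpy array.
targets_as_list = [[1, 5, 3], [2, 0, 4]]

# Converting it yields a proper 2-D [batch_size, seq_len] int array.
targets_as_array = np.array(targets_as_list, dtype=np.int32)
print(targets_as_array.shape)  # (2, 3)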

Upvotes: 1
