Reputation: 75
I am trying to do sentiment analysis on a German tweet dataset with the bert-base-german-cased model, which I imported via transformers from Hugging Face.
To calculate the predicted probabilities I want to apply a softmax to the model output, and here the issue begins.
F.softmax(model(input_ids, attention_mask), dim=1)
I got the error:
ValueError: not enough values to unpack (expected 2, got 1)
Does anyone know which values are expected here?
Everything works when I run it with:
self.bert = BertModel.from_pretrained(PRE_TRAINED_MODEL_NAME)
I get the error when I switch to:
self.bert = AutoModelWithLMHead.from_pretrained("bert-base-german-cased")
As you can probably see, I am a noob, so please keep the explanations simple and detailed (understandable for a fish :D).
input_ids and attention_mask are output values of the tokenization process.
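For context, this error typically comes from unpacking the model output into two values (a common pattern is `_, pooled_output = self.bert(...)`). A minimal, framework-free sketch of that failure mode (the tuple contents here are just illustrative placeholders):

```python
# BertModel (in older transformers versions) returns a 2-tuple
# (last_hidden_state, pooler_output), so unpacking two values works:
bert_output = ("last_hidden_state", "pooler_output")
hidden, pooled = bert_output

# AutoModelWithLMHead returns only the LM prediction scores (a 1-tuple),
# so the same unpacking raises the ValueError from the question:
lm_output = ("prediction_scores",)
try:
    hidden, pooled = lm_output
except ValueError as err:
    msg = str(err)
    print(msg)  # not enough values to unpack (expected 2, got 1)
```

So the "two expected values" are the two elements of BertModel's output tuple, which the LM-head model does not produce.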
Upvotes: 5
Views: 10896
Reputation: 31
BertModel expects a batch of training instances (e.g. input_ids shaped like [[...], [...]]). Hence, there should be no problem if you first batch your dataset (with something like DataLoader) and iterate over it.
It seems like you are passing a single training instance (e.g. input_ids [...]) for now.
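A minimal sketch of the batching idea with PyTorch's DataLoader, using dummy tokenized tensors in place of a real tokenizer output (the sizes are made up for illustration):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical tokenized data: 8 "tweets", each padded to length 10.
input_ids = torch.randint(0, 1000, (8, 10))
attention_mask = torch.ones(8, 10, dtype=torch.long)

loader = DataLoader(TensorDataset(input_ids, attention_mask), batch_size=4)
for ids, mask in loader:
    # Each batch is 2D ([batch_size, seq_len]), which is what BertModel expects.
    print(ids.shape)  # torch.Size([4, 10])
```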
Upvotes: 1
Reputation: 1578
It's a late answer, but it may help.
I had the same error. My problem was that input_ids and attention_mask have to be 2D tensors, but I had them as 1D tensors. So do
input_ids = input_ids.unsqueeze(0)
attention_mask = attention_mask.unsqueeze(0)
in your case, before passing them to the model.
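A small self-contained sketch of what unsqueeze does here, with a dummy 1D tensor and made-up logits (no real model involved):

```python
import torch
import torch.nn.functional as F

# A single tokenized sequence comes out 1D, shape [4]:
input_ids = torch.tensor([101, 146, 112, 102])

# unsqueeze(0) adds a batch dimension, giving shape [1, 4]:
input_ids = input_ids.unsqueeze(0)
print(input_ids.shape)  # torch.Size([1, 4])

# With a proper 2D input, softmax over dim=1 then works on the logits:
logits = torch.tensor([[2.0, 0.5, -1.0]])  # dummy classifier output
probs = F.softmax(logits, dim=1)           # rows sum to 1
```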
Upvotes: 12