soulwreckedyouth

Reputation: 595

Bert Transformer "Size Error" during Machine Translation

I'm getting desperate, as I have no clue what the problem is here. I want to translate a list of sentences from German to English. This is my code:


from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Helsinki-NLP/opus-mt-de-en")
model = AutoModelForSeq2SeqLM.from_pretrained("Helsinki-NLP/opus-mt-de-en")

batch = tokenizer(
    list(data_bert[:100]),
    padding=True,
    truncation=True,
    max_length=250,
    return_tensors="pt"
)



results = model(batch)

And I am getting this error:

---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
~/miniconda3/envs/textmallet/lib/python3.9/site-packages/transformers/tokenization_utils_base.py in __getattr__(self, item)
    247         try:
--> 248             return self.data[item]
    249         except KeyError:

KeyError: 'size'

During handling of the above exception, another exception occurred:

AttributeError                            Traceback (most recent call last)
/tmp/ipykernel_26502/2652187977.py in <module>
     14 
     15 
---> 16 results = model(batch)
     17 

~/miniconda3/envs/textmallet/lib/python3.9/site-packages/torch/nn/modules/module.py in _call_impl(self, *input, **kwargs)
   1049         if not (self._backward_hooks or self._forward_hooks or self._forward_pre_hooks or _global_backward_hooks
   1050                 or _global_forward_hooks or _global_forward_pre_hooks):
-> 1051             return forward_call(*input, **kwargs)
   1052         # Do not call functions when jit is used
   1053         full_backward_hooks, non_full_backward_hooks = [], []

~/miniconda3/envs/textmallet/lib/python3.9/site-packages/transformers/models/marian/modeling_marian.py in forward(self, input_ids, attention_mask, decoder_input_ids, decoder_attention_mask, head_mask, decoder_head_mask, cross_attn_head_mask, encoder_outputs, past_key_values, inputs_embeds, decoder_inputs_embeds, labels, use_cache, output_attentions, output_hidden_states, return_dict)
   1274                 )
   1275 
-> 1276         outputs = self.model(
   1277             input_ids,
   1278             attention_mask=attention_mask,

I have no clue what the precise issue could be. If someone can help me out, I'd be really thankful.

Upvotes: 2

Views: 2199

Answers (2)

Ivan Sviridov

Reputation: 63

For me, passing model(**batch) instead of model(batch) helped. Also, regarding Timbus Calin's answer: in some cases, slicing out only the input_ids key may not be enough, since the tokenizer output contains other items the model needs, such as attention_mask, which tells the model which elements of the input sequence are padding (more details about tokenizer outputs here).
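The difference between model(batch) and model(**batch) is plain keyword unpacking: the tokenizer returns a dict-like BatchEncoding, and without ** the whole dict gets bound to the model's first parameter. A minimal sketch with a hypothetical, simplified stand-in for the model's forward signature:

```python
# Hypothetical stand-in for the model's forward signature (simplified):
def forward(input_ids=None, attention_mask=None):
    # the real forward immediately inspects input_ids (e.g. calls .size()),
    # which fails if the whole tokenizer output dict was bound here
    assert input_ids is not None and attention_mask is not None
    return [len(seq) for seq in input_ids]

# Shape of a tokenizer output: a mapping with several keys
batch = {
    "input_ids": [[101, 7, 102]],
    "attention_mask": [[1, 1, 1]],
}

# forward(batch) would bind the entire dict to input_ids — wrong.
# forward(**batch) unpacks it, so each key becomes a keyword argument:
lengths = forward(**batch)  # [3]
```

This also shows why keeping the full batch matters: unpacking passes attention_mask along automatically, whereas slicing out input_ids alone discards it.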

Upvotes: 0

Timbus Calin

Reputation: 15063

In the issue described here (credits to LysandreJik): https://github.com/huggingface/transformers/issues/5480, the problem appears to be that a dict is passed where a tensor is expected.

It might be the case that you need to change the tokenizer output from:

batch = tokenizer(
    list(data_bert[:100]),
    padding=True,
    truncation=True,
    max_length=250,
    return_tensors="pt"
)

TO:

batch = tokenizer(
    list(data_bert[:100]),
    padding=True,
    truncation=True,
    max_length=250,
    return_tensors="pt")["input_ids"]
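Note that for translation specifically, calling the model's forward pass directly is usually not what you want anyway (a seq2seq forward pass expects decoder inputs or labels); model.generate is the intended entry point. A sketch of the full flow, assuming the Helsinki-NLP checkpoint downloads as in the question (the sample sentences are stand-ins for list(data_bert[:100])):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Helsinki-NLP/opus-mt-de-en")
model = AutoModelForSeq2SeqLM.from_pretrained("Helsinki-NLP/opus-mt-de-en")

# Stand-in sentences; in the question this would be list(data_bert[:100])
sentences = ["Das Wetter ist heute schön.", "Ich habe keine Ahnung."]

batch = tokenizer(
    sentences,
    padding=True,
    truncation=True,
    max_length=250,
    return_tensors="pt",
)

# generate() produces target-language token ids; unpacking the batch
# passes input_ids and attention_mask as keyword arguments
generated = model.generate(**batch)
translations = tokenizer.batch_decode(generated, skip_special_tokens=True)
```

This keeps the attention_mask in play (so padded positions are ignored) and decodes the generated token ids back into English strings.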

Upvotes: 2
