Reputation: 1
Two days ago this code worked, but now it doesn't, and I don't know why.
Please explain in detail how to get this code working again.
Here are my code and the error. (I used the Hugging Face Transformers API.)
Data => Tensor (I used tf.data.Dataset.from_tensor_slices)
import tensorflow as tf
with tf.device('/device:GPU:0'):
    train_dataset = tf.data.Dataset.from_tensor_slices((
        dict(train_tokenizer),
        train_target
    ))
    val_dataset = tf.data.Dataset.from_tensor_slices((
        dict(val_tokenizer),
        val_target
    ))
    test_dataset = tf.data.Dataset.from_tensor_slices((
        dict(test_tokenizer),
        test_target
    ))
Trainer Code
import numpy as np
from datasets import load_metric
from transformers import TFTrainer, TFTrainingArguments, TFBertForSequenceClassification

for epoch in range(5, 11):
    training_args = TFTrainingArguments(
        ...
    )

    def get_model():
        with training_args.strategy.scope():
            config = ...
            model = TFBertForSequenceClassification.from_pretrained(BERT_MODEL, config=config, from_pt=True)
        return model

    model = get_model()

    def compute_metrics(eval_preds):
        metric = load_metric("glue", "mrpc")
        logits, labels = eval_preds
        predictions = np.argmax(logits, axis=-1)
        return metric.compute(predictions=predictions, references=labels)

    trainer = TFTrainer(
        model=model,
        args=training_args,
        train_dataset=train_dataset,
        eval_dataset=val_dataset,
        compute_metrics=compute_metrics,
    )

    trainer.train()
    trainer.evaluate()
Error Code
in user code:
File "/usr/local/lib/python3.7/dist-packages/transformers/models/bert/modeling_tf_bert.py", line 1455, in call *
loss = None if inputs["labels"] is None else self.compute_loss(labels=inputs["labels"], logits=logits)
TypeError: compute_loss() got an unexpected keyword argument 'labels'
Call arguments received:
• input_ids={'input_ids': 'tf.Tensor(shape=(16, 128), dtype=int32)', 'token_type_ids': 'tf.Tensor(shape=(16, 128), dtype=int32)', 'attention_mask': 'tf.Tensor(shape=(16, 128), dtype=int32)'}
• attention_mask=None
• token_type_ids=None
• position_ids=None
• head_mask=None
• inputs_embeds=None
• output_attentions=None
• output_hidden_states=None
• return_dict=None
• labels=tf.Tensor(shape=(16,), dtype=int32)
• training=True
• kwargs=<class 'inspect._empty'>
Upvotes: 0
Views: 1037
Reputation: 11
It seems to be a bug in transformers==4.15.0, at least.
I fixed it by upgrading to transformers==4.16.2.
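For example, with pip (assuming your environment uses pip-managed packages), and then restarting the runtime so the new version is picked up:

pip install --upgrade transformers==4.16.2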
Upvotes: 1