Saurabh Verma

Reputation: 6728

Huggingface models - getting prediction from model stored locally

I'm able to run the following code successfully from Huggingface:

from transformers import pipeline, TFAutoModel
classifier = pipeline(task="text-classification", model="SamLowe/roberta-base-go_emotions", top_k=None)
sentences = ["This has been a good day"]
classifier(sentences) 

However, when trying to get a prediction from the locally saved model (saved with save_pretrained), I'm getting an unexpected error:

classifier.save_pretrained("SamLowe/roberta-base-go_emotions")
model = TFAutoModel.from_pretrained("SamLowe/roberta-base-go_emotions")
model(sentences)

Error:

Data of type <class 'str'> is not allowed only (<class 'tensorflow.python.framework.tensor.Tensor'>, <class 'bool'>, <class 'int'>, <class 'transformers.utils.generic.ModelOutput'>, <class 'tuple'>, <class 'list'>, <class 'dict'>, <class 'numpy.ndarray'>) is accepted for input_ids.

Any idea what I might be doing wrong here?

Upvotes: 0

Views: 141

Answers (1)

Poe Dator

Reputation: 4913

When loading with TFAutoModel, you receive a transformers.models.roberta.modeling_roberta.RobertaModel object. Its forward() method accepts tokenized input_ids, not raw strings.
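If you do want to call the model object directly rather than through a pipeline, you have to tokenize the strings yourself first. A minimal sketch (assuming transformers with TensorFlow installed; from_pt=True is only needed if the checkpoint has no TensorFlow weights):

```python
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

# Load the tokenizer and model from the same source (hub id or local directory)
tokenizer = AutoTokenizer.from_pretrained("SamLowe/roberta-base-go_emotions")
model = TFAutoModelForSequenceClassification.from_pretrained(
    "SamLowe/roberta-base-go_emotions", from_pt=True
)

# Tokenize first: the model expects input_ids tensors, not raw strings
inputs = tokenizer(["This has been a good day"], return_tensors="tf")
outputs = model(**inputs)
print(outputs.logits.shape)  # one row of logits per input sentence
```

Note the use of TFAutoModelForSequenceClassification rather than TFAutoModel: the former includes the classification head, so the logits correspond to the emotion labels.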

To do what you want, use the same syntax as you did when creating the original pipeline:

classifier2 = pipeline(task="text-classification", model=PATH_TO_SAVED_MODEL, top_k=None)
classifier2(sentences)
> [{'label': 'joy', 'score': 0.7501430511474609},
>  {'label': 'admiration', 'score': 0.3446090519428253},
>  {'label': 'approval', 'score': 0.09101027995347977}, ...
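Putting the whole round trip together, saving the pipeline writes the model, tokenizer, and config into one directory, and you reload it by passing that directory path back to pipeline(). A sketch, assuming a hypothetical local path of your choosing:

```python
from transformers import pipeline

# Build the pipeline once and save everything it needs to a local directory
classifier = pipeline(task="text-classification",
                      model="SamLowe/roberta-base-go_emotions", top_k=None)
classifier.save_pretrained("./go_emotions_local")  # hypothetical path

# Reload from the local directory with the same pipeline() call
classifier2 = pipeline(task="text-classification",
                       model="./go_emotions_local", top_k=None)
results = classifier2(["This has been a good day"])
print(results[0][0]["label"])  # highest-scoring emotion label
```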

Upvotes: 0
