Lukas

Reputation: 61

Getting started: Huggingface Model Cards

I just recently started looking into the Hugging Face transformers library. When I tried to get started with the example code from a model card, e.g. this community model:

from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("emilyalsentzer/Bio_ClinicalBERT")
model = AutoModel.from_pretrained("emilyalsentzer/Bio_ClinicalBERT")

I got the following error:

Traceback (most recent call last):
  File "test.py", line 2, in <module>
    tokenizer = AutoTokenizer.from_pretrained("emilyalsentzer/Bio_ClinicalBERT")
  File "/Users/Lukas/miniconda3/envs/nlp/lib/python3.7/site-packages/transformers/tokenization_auto.py", line 124, in from_pretrained
    "'xlm', 'roberta', 'ctrl'".format(pretrained_model_name_or_path))
ValueError: Unrecognized model identifier in emilyalsentzer/Bio_ClinicalBERT. Should contains one of 'bert', 'openai-gpt', 'gpt2', 'transfo-xl', 'xlnet', 'xlm', 'roberta', 'ctrl'

If I try a different tokenizer, such as "baykenney/bert-base-gpt2detector-topp92", I get the following error:

OSError: Model name 'baykenney/bert-base-gpt2detector-topp92' was not found in tokenizers model name list (bert-base-uncased, bert-large-uncased, bert-base-cased, bert-large-cased, bert-base-multilingual-uncased, bert-base-multilingual-cased, bert-base-chinese, bert-base-german-cased, bert-large-uncased-whole-word-masking, bert-large-cased-whole-word-masking, bert-large-uncased-whole-word-masking-finetuned-squad, bert-large-cased-whole-word-masking-finetuned-squad, bert-base-cased-finetuned-mrpc, bert-base-german-dbmdz-cased, bert-base-german-dbmdz-uncased). We assumed 'baykenney/bert-base-gpt2detector-topp92' was a path or url to a directory containing vocabulary files named ['vocab.txt'] but couldn't find such vocabulary files at this path or url.

Did I miss anything to get started? I feel like the model cards indicate that these three lines of code should be enough.

I am using Python 3.7, transformers 2.1.1, and PyTorch 1.5.

Upvotes: 0

Views: 1407

Answers (1)

cronoik

Reputation: 19495

Please update your transformers library to at least 2.4.0; older releases such as 2.1.1 cannot resolve community model identifiers like emilyalsentzer/Bio_ClinicalBERT, which is exactly what the "Unrecognized model identifier" error is telling you. The simplest way is to create a new conda environment and install your packages directly from PyPI with pip to get the most recent release (currently 2.11.0).
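A minimal sketch of that setup (the environment name nlp-new and the choice to install torch alongside transformers are just placeholders for your own setup); after upgrading, your original three lines should work unchanged:

from transformers import AutoTokenizer, AutoModel
import transformers

# Assumed setup commands, run in a shell beforehand:
#   conda create -n nlp-new python=3.7
#   conda activate nlp-new
#   pip install transformers torch

# Confirm the upgrade took effect (should print 2.4.0 or newer).
print(transformers.__version__)

# With a recent version, community identifiers of the form "user/model-name"
# are downloaded from the model hub automatically.
tokenizer = AutoTokenizer.from_pretrained("emilyalsentzer/Bio_ClinicalBERT")
model = AutoModel.from_pretrained("emilyalsentzer/Bio_ClinicalBERT")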

Upvotes: 2
