mahesh mj

Reputation: 15

Issue importing BertTokenizer for Q&A with fine-tuned BERT

I am trying to build a question-answering model using a BERT checkpoint fine-tuned on SQuAD.

import torch
from transformers import BertForQuestionAnswering, BertTokenizer
model = BertForQuestionAnswering.from_pretrained('bert-large-uncased-whole-word-masking-finetuned-squad')

When I try to load the tokenizer for the bert-large-uncased-whole-word-masking-finetuned-squad model, I get the error below:

tokenizer = BertTokenizer.from_pretrained('bert-large-uncased-whole-word-masking-finetuned-squad')

---------------------------------------------------------------------------
ModuleNotFoundError                       Traceback (most recent call last)
<ipython-input-29-d478833618be> in <module>
----> 1 tokenizer = BertTokenizer.from_pretrained('bert-large-uncased-whole-word-masking-finetuned-squad')

1 frames
/usr/local/lib/python3.7/dist-packages/transformers/tokenization_utils_base.py in _from_pretrained(cls, resolved_vocab_files, pretrained_model_name_or_path, init_configuration, use_auth_token, cache_dir, *init_inputs, **kwargs)
   1857     def _save_pretrained(
   1858         self,
-> 1859         save_directory: str,
   1860         file_names: Tuple[str],
   1861         legacy_format: bool = True,

ModuleNotFoundError: No module named 'transformers.models.auto.configuration_auto'


I am using the latest version of transformers in my notebook, but it still gives me this error. Can someone help me with this issue?

Upvotes: 0

Views: 308

Answers (2)

José Cañete

Reputation: 11

Try with:

from transformers import AutoTokenizer, AutoModelForQuestionAnswering

tokenizer = AutoTokenizer.from_pretrained("bert-large-uncased-whole-word-masking-finetuned-squad")

model = AutoModelForQuestionAnswering.from_pretrained("bert-large-uncased-whole-word-masking-finetuned-squad")

Upvotes: 1

Jules Gagnon-Marchand

Reputation: 3801

I suspect that you have code from a previous version in your cache. Try:

import transformers

tokenizer = transformers.BertTokenizer.from_pretrained('bert-large-uncased-whole-word-masking-finetuned-squad', cache_dir="./")
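As a quick diagnostic before reinstalling anything, you can check whether the exact submodule named in the traceback (`transformers.models.auto.configuration_auto`) is resolvable by the interpreter at all. This is a standard-library-only sketch, not part of either fix above; if `transformers` imports but that submodule is missing, the install is stale or mixed with files from an older version:

```python
import importlib.util

def module_available(name: str) -> bool:
    """Return True if `name` can be resolved by the current interpreter."""
    try:
        # find_spec resolves the module without fully importing it;
        # a dotted name raises ModuleNotFoundError if a parent is missing.
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        return False

# The submodule from the traceback. False here (with transformers itself
# importable) points to a stale or mismatched installation.
print(module_available("transformers.models.auto.configuration_auto"))
```

If this prints False, uninstalling and reinstalling transformers and then restarting the notebook runtime usually clears the stale files.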

Upvotes: 0
