Shushan

Reputation: 31

Cannot import pipeline after successful transformers installation

Environment info

Details

I am attempting to use a fresh installation of the transformers library, but after completing the installation with pip successfully, I am not able to run the test script: python -c "from transformers import pipeline; print(pipeline('sentiment-analysis')('we love you'))"

Instead, I see the following output:

/home/shushan/.conda/envs/ccg_parser/lib/python3.9/site-packages/gensim/similarities/__init__.py:15: UserWarning: The gensim.similarities.levenshtein submodule is disabled, because the optional Levenshtein package <https://pypi.org/project/python-Levenshtein/> is unavailable. Install Levenhstein (e.g. pip install python-Levenshtein) to suppress this warning.
  warnings.warn(msg)
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/home/shushan/.conda/envs/ccg_parser/lib/python3.9/site-packages/transformers/file_utils.py", line 1977, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/home/shushan/.conda/envs/ccg_parser/lib/python3.9/site-packages/transformers/file_utils.py", line 1986, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/home/shushan/.conda/envs/ccg_parser/lib/python3.9/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "/home/shushan/.conda/envs/ccg_parser/lib/python3.9/site-packages/transformers/pipelines/__init__.py", line 25, in <module>
    from ..models.auto.configuration_auto import AutoConfig
  File "/home/shushan/.conda/envs/ccg_parser/lib/python3.9/site-packages/transformers/models/__init__.py", line 19, in <module>
    from . import (
  File "/home/shushan/.conda/envs/ccg_parser/lib/python3.9/site-packages/transformers/models/layoutlm/__init__.py", line 22, in <module>
    from .configuration_layoutlm import LAYOUTLM_PRETRAINED_CONFIG_ARCHIVE_MAP, LayoutLMConfig
  File "/home/shushan/.conda/envs/ccg_parser/lib/python3.9/site-packages/transformers/models/layoutlm/configuration_layoutlm.py", line 19, in <module>
    from ..bert.configuration_bert import BertConfig
  File "/home/shushan/.conda/envs/ccg_parser/lib/python3.9/site-packages/transformers/models/bert/configuration_bert.py", line 21, in <module>
    from ...onnx import OnnxConfig
  File "/home/shushan/.conda/envs/ccg_parser/lib/python3.9/site-packages/transformers/onnx/__init__.py", line 16, in <module>
    from .config import EXTERNAL_DATA_FORMAT_SIZE_LIMIT, OnnxConfig, OnnxConfigWithPast
  File "/home/shushan/.conda/envs/ccg_parser/lib/python3.9/site-packages/transformers/onnx/config.py", line 18, in <module>
    from transformers import PretrainedConfig, PreTrainedTokenizer, TensorType
  File "/home/shushan/.conda/envs/ccg_parser/lib/python3.9/site-packages/transformers/file_utils.py", line 1977, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/home/shushan/.conda/envs/ccg_parser/lib/python3.9/site-packages/transformers/file_utils.py", line 1986, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/home/shushan/.conda/envs/ccg_parser/lib/python3.9/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "/home/shushan/.conda/envs/ccg_parser/lib/python3.9/site-packages/transformers/tokenization_utils.py", line 26, in <module>
    from .tokenization_utils_base import (
  File "/home/shushan/.conda/envs/ccg_parser/lib/python3.9/site-packages/transformers/tokenization_utils_base.py", line 74, in <module>
    from tokenizers import AddedToken
  File "/home/shushan/tokenization_experiments/tokenizers.py", line 26, in <module>
    from transformers import BertTokenizer
  File "/home/shushan/.conda/envs/ccg_parser/lib/python3.9/site-packages/transformers/file_utils.py", line 1978, in __getattr__
    value = getattr(module, name)
  File "/home/shushan/.conda/envs/ccg_parser/lib/python3.9/site-packages/transformers/file_utils.py", line 1977, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/home/shushan/.conda/envs/ccg_parser/lib/python3.9/site-packages/transformers/file_utils.py", line 1986, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/home/shushan/.conda/envs/ccg_parser/lib/python3.9/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "/home/shushan/.conda/envs/ccg_parser/lib/python3.9/site-packages/transformers/models/bert/tokenization_bert.py", line 23, in <module>
    from ...tokenization_utils import PreTrainedTokenizer, _is_control, _is_punctuation, _is_whitespace
ImportError: cannot import name 'PreTrainedTokenizer' from partially initialized module 'transformers.tokenization_utils' (most likely due to a circular import) (/home/shushan/.conda/envs/ccg_parser/lib/python3.9/site-packages/transformers/tokenization_utils.py)

I have tried uninstalling and re-installing transformers, but I couldn't find any more information about what is wrong or how to fix this issue. The only suspicious behavior is that the output of the transformers-cli env command above says my PyTorch version does not work with the GPU, while in reality my PyTorch installation does work with the GPU. Can you help? Thanks in advance, Shushan
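For reference, the checks I ran look roughly like this (a rough sketch, not the exact commands; versions and paths on my machine may differ slightly):

pip uninstall -y transformers && pip install transformers   # clean re-install of transformers
transformers-cli env                                         # prints the environment report referred to above
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"   # should print True if the GPU build of PyTorch is working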

Upvotes: 0

Views: 7698

Answers (1)

kkgarg

Reputation: 1376

The presence of both PyTorch and TensorFlow, or an incorrectly created environment, may be causing the issue. Try re-creating the environment with the bare minimum packages and keep only one of PyTorch or TensorFlow, as sketched below.
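A minimal sketch of such a re-creation (the environment name and versions here are only placeholders, adjust them to your setup):

conda create -n hf_minimal python=3.9     # placeholder environment name
conda activate hf_minimal
pip install torch transformers            # keep a single backend (PyTorch here), no TensorFlow
python -c "from transformers import pipeline; print(pipeline('sentiment-analysis')('we love you'))"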

It worked perfectly fine for me with the following config:

 - transformers version: 4.9.0
 - Platform: macOS-10.14.6-x86_64-i386-64bit
 - Python version: 3.9.2 
 - PyTorch version (GPU?): 1.7.1 (False) 
 - Tensorflow version (GPU?): not installed (NA) 
 - Flax version (CPU?/GPU?/TPU?): not installed (NA) 
 - Jax version: not installed 
 - JaxLib version: not installed 
 - Using GPU in script?: <fill in> 
 - Using distributed or parallel set-up in script?: <fill in>

Upvotes: 1
