Reputation: 1214
Working through the Hugging Face quantization guide, I installed the following:
pip install transformers accelerate bitsandbytes
(This installed transformers 4.26.0, accelerate 0.16.0, and bitsandbytes 0.37.0, which seems to match the guide's requirements.)
I then ran the first line of the offload code in Python:
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
However, this resulted in the following error:
ImportError: cannot import name 'BitsAndBytesConfig' from 'transformers' (/usr/local/lib/python3.10/dist-packages/transformers/__init__.py)
Running grep BitsAndBytesConfig -r /usr/local/lib/python3.10/dist-packages
yields nothing.
Is there a step I might have skipped, or a version inconsistency I could work around?
Upvotes: 5
Views: 24541
Reputation: 11
Change the Colab runtime to GPU (this is necessary for bitsandbytes) and run this:
!pip install git+https://github.com/huggingface/accelerate.git
!pip install git+https://github.com/huggingface/transformers.git
!pip install bitsandbytes
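Once those finish, a minimal 8-bit loading sketch should work on the GPU runtime. The model id facebook/opt-350m here is just an example I picked, not something from the guide; substitute any causal LM:

from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

quantization_config = BitsAndBytesConfig(load_in_8bit=True)
tokenizer = AutoTokenizer.from_pretrained("facebook/opt-350m")
model = AutoModelForCausalLM.from_pretrained(
    "facebook/opt-350m",
    quantization_config=quantization_config,
    device_map="auto",  # requires accelerate; places the layers on the GPU
)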
Upvotes: 1
Reputation: 391
Install from the original GitHub repository:
pip install git+https://github.com/huggingface/transformers
from transformers import BitsAndBytesConfig
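If the import still fails, check that pip actually replaced the release build. A quick sanity check (the exact dev version string will vary):

import transformers
print(transformers.__version__)  # should be a .dev0 build from git, not the 4.26.0 release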
Upvotes: 0
Reputation: 1214
BitsAndBytesConfig was added to transformers only recently, and the latest release (4.26.0) predates that change.
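You can confirm the mismatch directly (the 4.26.0 version string is taken from the question; on that release the name simply does not exist, which is why the grep found nothing):

import transformers
print(transformers.__version__)                     # 4.26.0
print(hasattr(transformers, "BitsAndBytesConfig"))  # False on this release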
The online documentation is generated from the source tree's .mdx files, so it sometimes documents features that have not been released yet. You can try them anyway by installing from source:
pip install git+https://github.com/huggingface/transformers
Upvotes: 7