Reputation: 11
I am trying to load the Llama 2 model on my CPU using CTransformers and get this "model_type not recognized" issue. If you know how to solve it, or can suggest another way to load the model on my CPU, please let me know. Thanks.
CODE:

from langchain.llms import CTransformers

llm = CTransformers(model="C:\Users\yalik\Downloads\llama-2-7b-chat.ggmlv3.q4_0.bin", model_type="LLama")
print(llm("hello LLama"))
error:
Model type 'LLama' is not supported.
Traceback (most recent call last):
  File "D:\LLama-2-poc\p,py.py", line 3, in <module>
    llm = CTransformers(model="C:\Users\yalik\Downloads\llama-2-7b-chat.ggmlv3.q4_0.bin", model_type="LLama")
  File "C:\Users\yalik\Desktop\lib\site-packages\langchain_core\load\serializable.py", line 97, in __init__
    super().__init__(**kwargs)
  File "C:\Users\yalik\Desktop\lib\site-packages\pydantic\v1\main.py", line 339, in __init__
    values, fields_set, validation_error = validate_model(__pydantic_self__.__class__, data)
  File "C:\Users\yalik\Desktop\lib\site-packages\pydantic\v1\main.py", line 1102, in validate_model
    values = validator(cls_, values)
  File "C:\Users\yalik\Desktop\lib\site-packages\langchain_community\llms\ctransformers.py", line 72, in validate_environment
    values["client"] = AutoModelForCausalLM.from_pretrained(
  File "C:\Users\yalik\Desktop\lib\site-packages\ctransformers\hub.py", line 175, in from_pretrained
    llm = LLM(
  File "C:\Users\yalik\Desktop\lib\site-packages\ctransformers\llm.py", line 253, in __init__
    raise RuntimeError(
RuntimeError: Failed to create LLM 'LLama' from 'C:\Users\yalik\Downloads\llama-2-7b-chat.ggmlv3.q4_0.bin'.
Process finished with exit code 1
Upvotes: 1
Views: 251
Reputation: 11
I ran into a similar issue with the Llama model, and switching model_type to "llama" resolved it for me: CTransformers expects the lowercase model-type identifier, so "LLama" is not recognized. It's worth giving that a try in your case as well!
from langchain.llms import CTransformers

# Use a raw string (r"...") so the backslashes in the Windows path
# are not interpreted as escape sequences.
llm = CTransformers(model=r"C:\Users\yalik\Downloads\llama-2-7b-chat.ggmlv3.q4_0.bin", model_type="llama")
print(llm("hello LLama"))
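Separate from the model_type fix, be careful with plain Windows path literals in Python: a backslash starts an escape sequence, so a path like "C:\Users\..." can raise a SyntaxError (because of \U) or silently corrupt the path (e.g. \n becomes a newline). A raw string keeps the backslashes literal. A minimal sketch, using made-up file names just for illustration:

```python
# Escaped backslashes and a raw string produce the same path text.
plain = "C:\\Users\\yalik\\Downloads\\model.bin"  # backslashes escaped by hand
raw = r"C:\Users\yalik\Downloads\model.bin"       # raw string: backslashes kept literal
assert plain == raw

# In an ordinary string literal, "\n" is a newline, so this path is silently broken:
bad = "C:\new_folder\model.bin"
assert "\n" in bad  # the "\n" in "\new_folder" became a newline character
```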
Upvotes: 0