rahularyansharma

Reputation: 10765

Could not load Llama model from path: ./Models/llama-7b.ggmlv3.q2_K.bin. Received error Llama.__init__() got an unexpected keyword argument 'input'

from langchain.llms import LlamaCpp
from langchain import PromptTemplate, LLMChain
from langchain.callbacks.manager import CallbackManager
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler

template = """Question: {question}

Answer: Let's work this out in a step by step way to be sure we have the right answer."""

prompt = PromptTemplate(template=template, input_variables=["question"])

callback_manager = CallbackManager([StreamingStdOutCallbackHandler()])

llm = LlamaCpp(
                model_path="./Models/llama-7b.ggmlv3.q2_K.bin",
                input={"temperature": 0.75,
                       "max_length": 2000,
                       "top_p": 1},
                callback_manager=callback_manager,
                verbose=True,
                )

llm_chain = LLMChain(prompt=prompt, llm=llm)

[screenshot of the current folder structure omitted]

Console output:

(llm) C:\llm>python app1.py
C:\llm\lib\site-packages\langchain\utils\utils.py:155: UserWarning: WARNING! input is not default parameter.
                input was transferred to model_kwargs.
                Please confirm that input is what you intended.
  warnings.warn(
Exception ignored in: <function Llama.__del__ at 0x000001923B3AE680>
Traceback (most recent call last):
  File "C:\llm\lib\site-packages\llama_cpp\llama.py", line 1507, in __del__
    if self.model is not None:
AttributeError: 'Llama' object has no attribute 'model'
Traceback (most recent call last):
  File "C:\llm\app1.py", line 14, in <module>
    llm = LlamaCpp(
  File "C:\llm\lib\site-packages\langchain\load\serializable.py", line 74, in __init__
    super().__init__(**kwargs)
  File "pydantic\main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for LlamaCpp
__root__
  Could not load Llama model from path: ./Models/llama-7b.ggmlv3.q2_K.bin. Received error Llama.__init__() got an unexpected keyword argument 'input' (type=value_error)
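For context on the traceback: the UserWarning shows LangChain moving the unrecognized `input` argument into `model_kwargs`, which is then forwarded to `llama_cpp.Llama.__init__`, where it fails. A corrected construction (a sketch, assuming the same model file and packages as in the question) passes the sampling settings as top-level keyword arguments instead:

```python
# A sketch of the corrected call, assuming the question's model file and
# packages. The sampling settings become top-level keyword arguments (note:
# LlamaCpp's parameter is `max_tokens`, not `max_length`), since neither
# LlamaCpp nor llama_cpp.Llama accepts an `input` argument.
params = dict(
    model_path="./Models/llama-7b.ggmlv3.q2_K.bin",
    temperature=0.75,
    max_tokens=2000,
    top_p=1,
    verbose=True,
)
assert "input" not in params  # the kwarg that triggered the ValidationError

try:  # guarded so the sketch fails gracefully when the model file is absent
    from langchain.llms import LlamaCpp
    llm = LlamaCpp(**params)
except Exception as exc:
    print(f"Could not construct LlamaCpp: {exc}")
```

The same `callback_manager` from the question can still be passed alongside these keyword arguments.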

Upvotes: 1

Views: 7432

Answers (3)

Keran Nagargooje

Reputation: 1

pip install llama-cpp-python==0.1.65 --force-reinstall --upgrade --no-cache-dir

This also works, though you may need to reinstall other dependent packages alongside llama-cpp-python.

Upvotes: -1

Eren Kalinsazlioglu

Reputation: 21

The newest versions of llama.cpp use the GGUF file format for model bindings.
Try one of the following:

  • Reinstall the latest llama-cpp-python with --force-reinstall --upgrade and use a model converted to GGUF (see, for example, the Hugging Face user "TheBloke").
  • Install an older llama-cpp-python version (<= 0.1.48) that still supports GGML files.

Upvotes: 1

Abinaya Shankar

Reputation: 26

You could try installing an older version of llama-cpp-python:

pip install llama-cpp-python==0.1.65 --force-reinstall --upgrade --no-cache-dir

This worked for me.
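After pinning the version, you can confirm which release actually ended up installed (a small sketch using only the standard library; it works whether or not the package is present):

```python
# Query the installed distribution version via importlib.metadata;
# PackageNotFoundError is raised when the package is not installed.
from importlib import metadata

try:
    print("llama-cpp-python", metadata.version("llama-cpp-python"))
except metadata.PackageNotFoundError:
    print("llama-cpp-python is not installed")
```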

Upvotes: 1
