Reputation: 1
I am using LlamaIndex with a locally downloaded Mistral model (mistral-7b-instruct-v0.2.Q4_K_M.gguf), loaded through the llama-cpp-python bindings. I define the agent with:
```python
worker1 = FunctionCallingAgentWorker.from_tools(
    [query_engine_tool], llm=llm, verbose=True, allow_parallel_tool_calls=True,
)
```
But on executing it, I get the following error:
```
AttributeError                            Traceback (most recent call last)
Cell In[43], line 2
      1 # define Agent and agent service
----> 2 worker1 = FunctionCallingAgentWorker.from_tools(
      3     [query_engine_tool], llm=llm, verbose=True, allow_parallel_tool_calls=True,
      4 )

File ~\anaconda3\envs\multiagent\Lib\site-packages\llama_index\core\agent\function_calling\step.py:158, in FunctionCallingAgentWorker.from_tools(cls, tools, tool_retriever, llm, verbose, max_function_calls, callback_manager, system_prompt, prefix_messages, **kwargs)
    154     prefix_messages = [ChatMessage(content=system_prompt, role="system")]
    156 prefix_messages = prefix_messages or []
--> 158 return cls(
    159     tools=tools,
    160     tool_retriever=tool_retriever,
    161     llm=llm,
    162     prefix_messages=prefix_messages,
    163     verbose=verbose,
    164     max_function_calls=max_function_calls,
    165     callback_manager=callback_manager,
    166     **kwargs,
    167 )

File ~\anaconda3\envs\multiagent\Lib\site-packages\llama_index\core\agent\function_calling\step.py:103, in FunctionCallingAgentWorker.__init__(self, tools, llm, prefix_messages, verbose, max_function_calls, callback_manager, tool_retriever, allow_parallel_tool_calls)
    100 """Init params."""
    101 if not llm.metadata.is_function_calling_model:
    102     raise ValueError(
--> 103         f"Model name {llm.model} does not support function calling API. "
    104     )
    105 self._llm = llm
    106 self._verbose = verbose

AttributeError: 'LlamaCPP' object has no attribute 'model'
```
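From what I can tell from the traceback, the worker first checks `llm.metadata.is_function_calling_model`, and when that is `False` it tries to format `llm.model` into the `ValueError` message; since `LlamaCPP` has no `model` attribute, the `AttributeError` is raised before the `ValueError` can be built. The following is a minimal, self-contained sketch of that failure shape, using hypothetical stand-in classes (not the real llama_index types):

```python
from dataclasses import dataclass

# Hypothetical stand-ins, only to reproduce the shape of the failure
# shown in the traceback above.
@dataclass
class FakeMetadata:
    is_function_calling_model: bool = False  # LlamaCPP reports False here

class FakeLlamaCPP:
    """Mimics LlamaCPP: it exposes model_path, but no `model` attribute."""
    metadata = FakeMetadata()
    model_path = "mistral-7b-instruct-v0.2.Q4_K_M.gguf"

def init_check(llm):
    # Same check as FunctionCallingAgentWorker.__init__ (step.py lines 101-104)
    if not llm.metadata.is_function_calling_model:
        raise ValueError(
            f"Model name {llm.model} does not support function calling API. "
        )

try:
    init_check(FakeLlamaCPP())
except AttributeError as e:
    # Accessing llm.model raises before the ValueError is ever constructed
    print(e)  # → 'FakeLlamaCPP' object has no attribute 'model'
```

So the `AttributeError` appears to be masking the underlying problem: the model is not being treated as one that supports the function calling API in the first place.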
I am developing in Python. Any help will be highly appreciated.
Upvotes: 0
Views: 119