Reputation: 1
I am using the Hugging Face Inference API for a basic GenAI application using Llama 3.2 and Mistral. While calling the API I am getting the error below:
(MaxRetryError("HTTPSConnectionPool(host='api-inference.huggingface.co', port=443): Max retries exceeded with url: /models/mistralai/Mistral-7B-Instruct-v0.3/v1/chat/completions (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1000)')))"), '(Request ID: b4e71615-6cbc-4e46-9315-7403f74e398d)')
The same error occurs with the Llama 3.2 model as well.
The relevant code snippet is below:
@staticmethod
def retLLMModel():
    # Build the Hugging Face Inference API endpoint used by the agent.
    llm = HuggingFaceEndpoint(
        repo_id="mistralai/Mistral-7B-Instruct-v0.2",
        # repo_id="codellama/CodeLlama-34b-Instruct-hf",
        max_new_tokens=1500,
        top_k=10,
        top_p=0.95,
        typical_p=0.95,
        temperature=0.01,
        repetition_penalty=1.03,
        huggingfacehub_api_token=os.getenv("HUGGING_FACE_KEY"),
    )
    return llm
def retAgentExecutor(self, template, tools_list, llm):
    # Pull the base ReAct prompt and inject the custom instructions.
    base_prompt = self.cqa_obj.get_hub()
    react_prompt = base_prompt.partial(instructions=template)
    agent = create_react_agent(llm=llm, tools=tools_list, prompt=react_prompt)
    agent_executor = AgentExecutor(agent=agent, tools=tools_list, verbose=False, max_iterations=10,
                                   return_intermediate_steps=True, handle_parsing_errors=True)
    return agent_executor
def executing_function_name(state: GraphState):
    ## giving executing function relevant code part
    llm = Utils.retLLMModel()
    try:
        template = prompts_custom.SeqChainPrompt.get_react_user_query_template()
        template = template.format(user=messages_[0][1], columns=columns,
                                   dataframe_path=dataframe_path, python_tool=python_tool)
        agent_executor = utils_obj.retAgentExecutor(template, tools_list, llm)
        result = agent_executor.invoke(input={"input": messages_[0][1]})
        # Strip the code fence / end-of-sequence markers from the model output.
        code = result['output'].strip("```</s>")
        if len(code.split("```")) > 1:
            code = code.split("```")[1].replace("python", "")
        print("result = ", result)
    except Exception as e:
        error.append(f"yes : {e}")
        # print(e)
        # print("error_code 1 = ", error)
    return {"generation": " ",
            "error": error}
I tried updating the requests library, set the REQUESTS_CA_BUNDLE environment variable, and confirmed that certifi is installed, but the error persists. What can I do to resolve this?
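For completeness, this is roughly how I point the CA-bundle variables at certifi's bundle; the exact placement (before any HTTPS call is made) is an assumption about where it needs to go in my code:

# Sketch: point the CA-bundle environment variables at certifi's bundle
# before any HTTPS request is issued.
import os
import certifi

os.environ["REQUESTS_CA_BUNDLE"] = certifi.where()
os.environ["SSL_CERT_FILE"] = certifi.where()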
Upvotes: 0
Views: 33