Enoch

Reputation: 33

Why is the message content always empty when I create my own retriever tool in LangChain?

I created a simple retriever based on my docs, then created a tool from it and bound it to the LLM object, but when I ask a question the response content is empty.

This is the code:

from langchain_openai import ChatOpenAI, OpenAIEmbeddings
from langchain_community.vectorstores import FAISS
from langchain.tools.retriever import create_retriever_tool

llm = ChatOpenAI(temperature=0, model="gpt-4o", api_key=api_key)
embeddings = OpenAIEmbeddings()
db = FAISS.from_documents(docs, embeddings)
retriever = db.as_retriever(search_kwargs={"k": 4})
tool = create_retriever_tool(
    retriever,
    name="custom_tool",
    description="a tool for retrieving information",
)
llm_with_tools = llm.bind_tools([tool])
msg = llm_with_tools.invoke("Who is Steve Jobs?")

msg contains:

AIMessage(content='', additional_kwargs={'tool_calls': [{'id': 'call_l1eN4QPHf2LBgC4BGZ95NgJt', 'function': {'arguments': '{"query":"Steve Jobs"}', 'name': 'wiki_tool'}, 'type': 'function'}]}, response_metadata={'token_usage': {'completion_tokens': 16, 'prompt_tokens': 58, 'total_tokens': 74}, 'model_name': 'gpt-4o-2024-05-13', 'system_fingerprint': 'fp_d576307f90', 'finish_reason': 'tool_calls', 'logprobs': None}, id='run-675e9ea1-b5f7-41f5-873a-f23142132243-0', tool_calls=[{'name': 'custom_tool', 'args': {'query': 'Steve Jobs'}, 'id': 'call_l1eN4QPHf2LBgC4BGZ95NgJt'}], usage_metadata={'input_tokens': 58, 'output_tokens': 16, 'total_tokens': 74})

What is the problem?

Thank you

Upvotes: 2

Views: 1090

Answers (1)

Harsh

Reputation: 53

With

llm_with_tools = llm.bind_tools([tool])

you are only passing your function/tool schema to the LLM.

In response to the user query, the LLM does not execute the tool; it just returns a function call with the arguments for your custom function. That is why `content` is empty and `finish_reason` is `tool_calls`.

You then have to execute the tool yourself with those arguments and prompt the LLM again, this time including the tool's output from the previous call.

Refer to this example from the LangChain docs:

from langchain_core.messages import HumanMessage, ToolMessage
from langchain_core.tools import tool

@tool
def add(a: int, b: int) -> int:
    """Adds a and b.

    Args:
        a: first int
        b: second int
    """
    return a + b


@tool
def multiply(a: int, b: int) -> int:
    """Multiplies a and b.

    Args:
        a: first int
        b: second int
    """
    return a * b


tools = [add, multiply]
# binding the function schemas to the LLM
llm_with_tools = llm.bind_tools(tools)

messages = [HumanMessage(query)]
# first call: the LLM only returns tool calls, so ai_msg.content is empty
ai_msg = llm_with_tools.invoke(messages)
messages.append(ai_msg)

for tool_call in ai_msg.tool_calls:
    selected_tool = {"add": add, "multiply": multiply}[tool_call["name"].lower()]
    # executing the tool with the arguments the LLM generated
    tool_output = selected_tool.invoke(tool_call["args"])
    messages.append(ToolMessage(tool_output, tool_call_id=tool_call["id"]))

# second call: prompting the LLM again with the tool outputs included,
# so this response has non-empty content
llm_with_tools.invoke(messages)
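To see why two calls are needed without any API keys, here is a minimal sketch of the same dispatch loop with the LLM mocked out as a plain function. `fake_llm` and the dict-based messages are illustrative stand-ins, not part of the LangChain API:

```python
def fake_llm(messages):
    """Stands in for llm_with_tools.invoke: on the first pass it only
    emits a tool call; once a tool result is in the history, it answers."""
    if not any(m["role"] == "tool" for m in messages):
        # First pass: content is empty, the model asks for a tool run.
        return {"content": "", "tool_calls": [
            {"id": "call_1", "name": "add", "args": {"a": 2, "b": 3}}]}
    # Second pass: a tool result is present, so content is filled in.
    result = next(m["content"] for m in messages if m["role"] == "tool")
    return {"content": f"The answer is {result}.", "tool_calls": []}

def add(a: int, b: int) -> int:
    return a + b

tools = {"add": add}
messages = [{"role": "user", "content": "What is 2 + 3?"}]

ai_msg = fake_llm(messages)  # content == "" here, as in the question
for tool_call in ai_msg["tool_calls"]:
    # execute the tool ourselves and feed the result back
    output = tools[tool_call["name"]](**tool_call["args"])
    messages.append({"role": "tool", "tool_call_id": tool_call["id"],
                     "content": str(output)})

final = fake_llm(messages)   # now content is non-empty
print(final["content"])      # The answer is 5.
```

The empty `content` in your `msg` is exactly the first pass above: the model stopped to request a tool run, and it only produces text once you send it the tool's output.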

Upvotes: 1
