imclerran

Reputation: 66

LangChain CSV agent / Pandas Dataframe agent returns json function call to user instead of executing it

Context

I am attempting to write a simple script to provide CSV data analysis to a user. I am using the CSV agent, which is essentially a wrapper for the Pandas DataFrame agent; both are included in langchain-experimental. After initializing the LLM and the agent (the CSV agent is initialized with a CSV file containing data from an online retailer), I run the agent with agent.run(user_message).
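
For reference, a rough sketch of what the CSV agent does internally: it reads the file into a DataFrame and builds a Pandas DataFrame agent around it. The file name here is just a placeholder, and chat_model refers to the model defined in the full code below.

import pandas as pd
from langchain_experimental.agents.agent_toolkits import create_pandas_dataframe_agent

# Placeholder file name for illustration; the real script asks the user for a path.
df = pd.read_csv("online_retail.csv")

# create_csv_agent is essentially a thin wrapper around this call.
agent = create_pandas_dataframe_agent(chat_model, df, verbose=True)
agent.run("How many rows are there?")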

Expectation

The agent should prompt the LLM using the OpenAI functions template, and the LLM should return a JSON result that specifies the Python REPL tool and the Python code to be executed. That code should then be executed by the Python REPL, the result passed back to the LLM, and the LLM should respond with a natural-language answer describing the result.

For example, from the documentation:

agent.run("how many rows are there?")
> Entering new  chain...

Invoking: `python_repl_ast` with `df.shape[0]`


891There are 891 rows in the dataframe.

> Finished chain.

Problem Statement

The agent begins execution and receives a JSON result describing the tool to use and the query to pass to it, but never invokes the tool.

Problem Examples:

agent.run("How many rows are there?")

produces the result:

> Entering new AgentExecutor chain...
{
  "function": "python_repl_ast",
  "parameters": {
    "query": "len(df)"
  }
}

> Finished chain.

Full Code

import os
from langchain.chat_models import ChatOpenAI
from langchain.agents.agent_types import AgentType
from langchain_experimental.agents.agent_toolkits import create_csv_agent

# Mixtral served through OpenRouter's OpenAI-compatible API
api_key = os.environ.get("OPENROUTER_API_KEY")
api_base = "https://openrouter.ai/api/v1"
model = "mistralai/mixtral-8x7b-instruct"
chat_model = ChatOpenAI(
    api_key=api_key,
    base_url=api_base,
    model=model,
    temperature=0.0,
)

def main():
    filepath = input("Enter the path to the CSV file: ")
    # Build the CSV agent around the user-supplied file
    agent = create_csv_agent(
        chat_model,
        filepath,
        verbose=True,
        openai_model=chat_model,
        agent_type=AgentType.OPENAI_FUNCTIONS,
    )
    # Simple REPL loop: forward each user message to the agent
    while True:
        user_message = input("You: ")
        lower_message = user_message.lower()
        if lower_message == "goodbye" or lower_message == "goodbye!":
            break
        response = agent.run(user_message)
        print(response)


if __name__ == "__main__":
    main()

Upvotes: 1

Views: 1752

Answers (1)

imclerran

Reputation: 66

I've answered my own question: the problem is that the Mistral model has not been fine-tuned to use the OpenAI functions format, so it returns the function-call JSON as plain text instead of actually invoking the tool.

There are two solutions that will fix the issue (a sketch of both follows the list):

  1. Change the model: model="openai/gpt-3.5-turbo"
    • Any of the models listed here support OpenAI function calling.
  2. Change the agent type: agent_type=AgentType.ZERO_SHOT_REACT_DESCRIPTION
    • To use a model that does not support OpenAI functions, use a different agent type. Any of the other agent types listed here will work.
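
Here is a minimal sketch of both fixes applied to the original script; only the ChatOpenAI / create_csv_agent arguments change, and everything else (api_key, api_base, filepath) stays the same.

# Option 1: keep AgentType.OPENAI_FUNCTIONS, but use a model that supports function calling
chat_model = ChatOpenAI(
    api_key=api_key,
    base_url=api_base,
    model="openai/gpt-3.5-turbo",  # function-calling capable model via OpenRouter
    temperature=0.0,
)
agent = create_csv_agent(
    chat_model,
    filepath,
    verbose=True,
    agent_type=AgentType.OPENAI_FUNCTIONS,
)

# Option 2: keep mixtral, but switch to a ReAct-style agent that does not need OpenAI functions
agent = create_csv_agent(
    chat_model,  # mistralai/mixtral-8x7b-instruct
    filepath,
    verbose=True,
    agent_type=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
)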

Upvotes: 1
