jmelm93

Reputation: 133

LangChain/LangGraph - TypeError: create_react_agent() got an unexpected keyword argument 'response_format'

I don't get why this doesn't work. The code is taken from the documentation, which clearly states that create_react_agent has a response_format option, yet it raises:

TypeError: create_react_agent() got an unexpected keyword argument 'response_format'

Anyone know what's going on here? Any help would be appreciated.

The documentation I used to build this is shown within the code itself below.

# https://langchain-ai.github.io/langgraph/how-tos/create-react-agent-structured-output/

from pydantic import BaseModel, Field
from langgraph.prebuilt import create_react_agent
from langchain_openai import ChatOpenAI
from langchain_core.tools import tool
from typing import Literal
from dotenv import load_dotenv

load_dotenv()

model = ChatOpenAI(model="gpt-4o")

class WeatherResponse(BaseModel):
    """Respond to the user in this format."""
    conditions: str = Field(description="Weather conditions")

@tool
def get_weather(city: Literal["nyc", "sf"]):
    """Use this to get weather information."""
    if city == "nyc":
        return "It might be cloudy in nyc"
    elif city == "sf":
        return "It's always sunny in sf"
    else:
        raise AssertionError("Unknown city")

system_prompt = "Get either the weather in NYC or SF."

tools = [get_weather]

graph = create_react_agent(
    # https://langchain-ai.github.io/langgraph/reference/prebuilt/#langgraph.prebuilt.chat_agent_executor.create_react_agent
    model,
    tools=tools,
    # response_format (Optional[Union[StructuredResponseSchema, tuple[str, StructuredResponseSchema]]], default: None ) – An optional schema for the final agent output.
    response_format=WeatherResponse,
    # state_modifier (Optional[StateModifier], default: None ) – An optional state modifier. This takes full graph state BEFORE the LLM is called and prepares the input to LLM.
    state_modifier=system_prompt
)

inputs = {"messages": [("user", "What's the weather in NYC?")]}
response = graph.invoke(inputs)

print('response:', response)

Upvotes: 0

Views: 194

Answers (1)

The response_format parameter was introduced in LangGraph version 0.2.62, so an older install will raise exactly this TypeError. Try updating with: pip install -U langgraph.
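If you want to confirm the installed version before (or after) upgrading, a minimal sketch using the standard-library importlib.metadata looks like this (the 0.2.62 threshold comes from the answer above; the parse helper is a simplified version comparison that assumes plain numeric x.y.z version strings):

```python
from importlib.metadata import version, PackageNotFoundError

# Minimum version where create_react_agent accepts response_format,
# per the answer above.
MIN_VERSION = (0, 2, 62)

def parse(v: str) -> tuple:
    """Turn 'x.y.z' into a comparable tuple of ints (simplified helper)."""
    return tuple(int(p) for p in v.split(".")[:3])

try:
    installed = version("langgraph")
    if parse(installed) < MIN_VERSION:
        print(f"langgraph {installed} is too old; run: pip install -U langgraph")
    else:
        print(f"langgraph {installed} should support response_format")
except PackageNotFoundError:
    print("langgraph is not installed in this environment")
```

After upgrading, re-run the original script unchanged; the TypeError should disappear once the installed version is at or above 0.2.62.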

Upvotes: 0
