Luis Leal

Reputation: 3524

How to pass multiple inputs during invoke to a MessageGraph?

We have a MessageGraph for an LLMCompiler implementation. As expected, we pass the user's question to invoke as a list of HumanMessage objects, which are mapped to a default "messages" key and passed to the prompt templates. This works fine for simple use cases, but now we need to pass an additional piece of information/context at invoke time (not at graph-building time). We did something similar with a React agent, where it was as easy as passing a dictionary: invoke({"messages": input, "context": context}). For MessageGraph this did not work: the list of messages passed to invoke(messages) is automatically mapped to the "messages" key in the prompt, and no other inputs can be added. I tried passing a dictionary, invoke({"messages": messages, "context": context}), and it failed with this error:

Message dict must contain 'role' and 'content' keys, got {"messages":messages,"context":context}
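For reference, a minimal sketch of the constraint (the echo node and message contents below are made up for illustration): MessageGraph state is a plain list of messages, so a dict input gets coerced into a single message and rejected.

from langchain_core.messages import HumanMessage
from langgraph.graph import MessageGraph

def echo(messages):
    # A MessageGraph node receives the full message list; whatever it
    # returns is appended to that list.
    return [HumanMessage(content=f"echo: {messages[-1].content}")]

builder = MessageGraph()
builder.add_node("echo", echo)
builder.set_entry_point("echo")
builder.set_finish_point("echo")
graph = builder.compile()

graph.invoke([HumanMessage(content="hello")])  # works: a list of messages

# Fails: MessageGraph state has no extra keys, so the dict is treated as
# a single malformed message dict.
# graph.invoke({"messages": messages, "context": context})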

Upvotes: 0

Views: 150

Answers (1)

biplob biswas

Reputation: 104

'messages' has to be a list of messages of type HumanMessage/AIMessage/SystemMessage/ToolMessage. You can prepend the context (assuming it's a string) as a SystemMessage to your existing message list and then invoke.

from langchain_core.messages import SystemMessage
llm.invoke([SystemMessage(content=context)] + existing_messages)
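Applied to the compiled graph from the question (the graph, context, and messages names are assumptions), the same idea would look like:

# The context rides along inside the message list itself, so the default
# "messages" mapping in the prompt templates keeps working unchanged.
graph.invoke([SystemMessage(content=context)] + messages)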

Upvotes: 1
