DPM

Reputation: 935

How do create_history_aware_retriever and RunnableWithMessageHistory interact when used together?

I am building a chatbot, following the Conversational RAG example in langchain's documentation: https://python.langchain.com/v0.2/docs/tutorials/qa_chat_history/

So far I have been able to create the bot with chat history, exactly like the example, only with my own model and retriever:

retriever = load_embeddings()
llm = load_llm()
history_aware_retriever = contextualize_llm_with_chat_history(model=llm, embeddings_retriever=retriever)
rag_chain = create_qa_llm_chain(chat_history_chain=history_aware_retriever, model=llm)

It is only when I introduce RunnableWithMessageHistory that I encounter an error, and only when I use the stream invocation of the model:

from langchain_community.chat_message_histories import ChatMessageHistory
from langchain_core.chat_history import BaseChatMessageHistory

store = {}

def get_session_history(session_id: str) -> BaseChatMessageHistory:
    if session_id not in store:
        store[session_id] = ChatMessageHistory()
    return store[session_id]

conversational_rag_chain = RunnableWithMessageHistory(
    rag_chain,
    get_session_history,
    input_messages_key="input",
    history_messages_key="chat_history",
    output_messages_key="answer",
)
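Conceptually, RunnableWithMessageHistory wraps rag_chain: before each call it loads the history for the session and injects it under history_messages_key, and after the call it appends the message found under input_messages_key and the output found under output_messages_key back into the store. A rough pure-Python sketch of that bookkeeping (a simplified model for illustration, not LangChain's actual implementation; fake_chain is a stand-in for rag_chain):

```python
# Simplified model of what RunnableWithMessageHistory does around the
# wrapped chain -- NOT the real LangChain implementation.
store = {}

def get_session_history(session_id):
    # One plain list of (role, text) pairs per session.
    return store.setdefault(session_id, [])

def with_message_history(chain, session_id, inputs,
                         input_key="input", history_key="chat_history",
                         output_key="answer"):
    history = get_session_history(session_id)
    # 1. Inject the stored history before invoking the chain.
    result = chain({**inputs, history_key: list(history)})
    # 2. Afterwards, persist the human input and the model's answer --
    #    this is where output_messages_key="answer" is looked up.
    history.append(("human", inputs[input_key]))
    history.append(("ai", result[output_key]))
    return result

# A stand-in "chain" that just echoes, to show the bookkeeping:
fake_chain = lambda d: {"answer": f"echo: {d['input']} (history={len(d['chat_history'])})"}
out = with_message_history(fake_chain, "abc123", {"input": "hi"})
# out["answer"] is "echo: hi (history=0)"; store["abc123"] now holds 2 messages
```

The key point is step 2: the wrapper assumes it can find output_messages_key in whatever the chain returned, which matters for the streaming error below.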

If I use conversational_rag_chain.invoke() I receive the information I expect. However, when I choose to stream it:

for chunk in conversational_rag_chain.stream({"input": question}, config={"configurable": {"session_id": "abc123"}}):
    print(chunk)

Before the streaming starts I can see the following in the console:

Error in RootListenersTracer.on_chain_end callback: KeyError('answer')
Error in callback coroutine: KeyError('answer')

I do not understand what this error means, since the stream starts as expected right after those two messages.
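My understanding of the likely mechanism (an assumption based on the symptoms, not confirmed from LangChain's source): when the chain is streamed, a retrieval chain yields partial dicts, and early chunks carry keys like "input" or "context" but not yet "answer". A history callback that does a strict chunk["answer"] lookup then raises KeyError on those chunks, while the stream itself continues. A pure-Python sketch of that failure mode, with a tolerant .get() accumulation alongside it (stream_chunks is a hypothetical stand-in for the chain's stream output):

```python
# Hypothetical sketch of the failure mode: a strict chunk["answer"] lookup
# fails on streamed chunks that do not carry the "answer" key yet.
def stream_chunks():
    # Streamed retrieval-chain output arrives as partial dicts;
    # only some chunks carry the "answer" key.
    yield {"input": "my question"}
    yield {"context": ["doc1", "doc2"]}
    yield {"answer": "The "}
    yield {"answer": "result."}

errors = []
answer = ""
for chunk in stream_chunks():
    try:
        _ = chunk["answer"]            # strict lookup -> KeyError on early chunks
    except KeyError as e:
        errors.append(repr(e))
    answer += chunk.get("answer", "")  # tolerant accumulation keeps streaming
# errors == ["KeyError('answer')", "KeyError('answer')"], answer == "The result."
```

This would explain why the error appears before the tokens arrive and yet the stream completes normally: only the callback's lookup fails, not the generation itself.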

I can see I am not the only one with this issue: https://github.com/langchain-ai/langchain/issues/24713

Upvotes: 1

Views: 1705

Answers (1)

SmittySmee

Reputation: 400

Please verify that your Python code is valid; the formatting appears to be wrong as posted. Could you also include the prompt, showing the keys it expects as input?

I found this article helpful. It ties together a setup similar to yours, and following it I was able to swap invoke for stream without error:

https://python.langchain.com/v0.1/docs/use_cases/question_answering/chat_history/#tying-it-together

Upvotes: 0
