Reputation: 21
I have created a Python app that stores documents (PDFs, URLs) in a Chroma vector store via LangChain. I then retrieve this data, ask questions, and get responses from a locally hosted model. I was able to add chat history when using ChatPromptTemplate.from_messages. In the code below, however, I don't know where to add the chat history when using ChatPromptTemplate.from_template.
after_rag_template = """You are a respectful and honest assistant. You have to answer the user's \
questions using only the context provided to you. If you don't know the answer, \
just say you don't know. Don't try to make up an answer.
{context}
Question: {question}
"""
after_rag_prompt = ChatPromptTemplate.from_template(after_rag_template)
after_rag_chain = (
    {"context": retriever, "question": RunnablePassthrough()}
    | after_rag_prompt
    | model_local
    | StrOutputParser()
)
content = after_rag_chain.invoke(prompt)
Upvotes: 1
Views: 336
Reputation: 640
There are a few pieces of documentation, both on the LangChain site itself and on tutorial sites like DevNavigator, that you can rely on to solve this, but it essentially comes down to a few simple steps:
Define Prompt Template: Create a template with placeholders for {chat_history}, {context}, and {question}. You can use the same template class from the documentation cited above.
Format Chat History: Convert a list of HumanMessage and AIMessage objects into a formatted string for insertion into the prompt.
Build RAG Chain: Use a retriever to fetch relevant context. Pass chat_history and question into the chain. Use a language model to generate responses.
Invoke the Chain: Provide the user’s query and the formatted chat history as inputs when calling the chain.
Manage Chat History Statefully: Use LangChain's ConversationBufferMemory to maintain chat history across interactions. Check out the documentation cited above to learn more.
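Steps 1 through 4 can be sketched roughly as below. This is a minimal illustration, not your exact chain: plain (role, text) tuples stand in for LangChain's HumanMessage/AIMessage objects, and the history contents are invented for the example.

```python
# Hypothetical sketch of steps 1-4: a from_template prompt that also
# carries a {chat_history} placeholder alongside {context} and {question}.

after_rag_template = """You are a respectful and honest assistant. You have to answer the user's
questions using only the context provided to you. If you don't know the answer,
just say you don't know. Don't try to make up an answer.

Previous conversation:
{chat_history}

Context:
{context}

Question: {question}
"""

def format_chat_history(messages):
    """Render (role, text) pairs as 'Human: ...' / 'AI: ...' lines (step 2)."""
    return "\n".join(f"{role}: {text}" for role, text in messages)

# Steps 3-4: in the real chain you would pass a dict to invoke(), e.g.
#   after_rag_chain.invoke({"question": prompt,
#                           "chat_history": format_chat_history(history)})
# routing each key in the chain's input map (operator.itemgetter is one way).
history = [("Human", "What formats can I ingest?"), ("AI", "PDFs and URLs.")]
filled = after_rag_template.format(
    chat_history=format_chat_history(history),
    context="Chroma stores document embeddings.",
    question="How do I add chat history?",
)
```

Note that once you invoke the chain with a dict instead of a bare string, the `RunnablePassthrough()` in your input map would need to pick out the `question` key rather than passing the whole input through.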
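For step 5, LangChain's ConversationBufferMemory does the bookkeeping for you; as a rough illustration of the state it maintains, a toy stand-in (illustrative names, not the real API) might look like:

```python
class SimpleChatMemory:
    """Toy stand-in for ConversationBufferMemory: records each turn and
    replays the whole buffer as a formatted string."""

    def __init__(self):
        self.turns = []  # list of (role, text) pairs, oldest first

    def save_context(self, user_text, ai_text):
        # Append one full exchange: the user's message and the model's reply.
        self.turns.append(("Human", user_text))
        self.turns.append(("AI", ai_text))

    def as_string(self):
        # Produce the string you would feed into the {chat_history} slot.
        return "\n".join(f"{role}: {text}" for role, text in self.turns)

memory = SimpleChatMemory()
memory.save_context("What is Chroma?", "A vector store for embeddings.")
memory.save_context("Can it persist to disk?", "Yes, via a persist directory.")
chat_history = memory.as_string()
```

The real ConversationBufferMemory works with message objects rather than strings, so check the documentation cited above for the exact interface.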
Upvotes: 1