Reputation: 189
I want to make a chatbot that answers questions from context, in my case a vector database, and it does that perfectly. But I also want it to answer questions that are not in the vector database, and it is unable to do so: it only answers from the context.
This is the prompt template I have for this:
template = """Answer the question in your own words from the
context given to you.
If questions are asked where there is no relevant context available, please answer from
what you know.
Context: {context}
Chat history: {chat_history}
Human: {question}
Assistant:"""
My prompt is as follows:
prompt = PromptTemplate(
    input_variables=["context", "chat_history", "question"], template=template
)
For the memory, I seeded it with an initial question-and-answer pair:
memory.save_context({"input": "Who is the founder of India?"},
                    {"output": "Gandhi"})
For the QA Retrieval, I am using the following code:
qa = RetrievalQA.from_chain_type(
    llm=llm,
    retriever=vectorstore.as_retriever(),
    memory=memory,
    chain_type_kwargs={'prompt': prompt}
)
But when I ask a question like:
question = "What did I ask about India?"
result = qa({"query": question})
It doesn't have an answer, even though this exchange is stored in the chat history. It is only able to answer questions from the vector database. I would greatly appreciate any help with this.
Upvotes: 2
Views: 6177
Reputation: 2876
Below is code that stores the history by default; if there is no answer in the document store, it will fall back to what the LLM knows.
from langchain.vectorstores import Chroma
from langchain.text_splitter import CharacterTextSplitter
from langchain.llms import OpenAI
from langchain.chains import RetrievalQA
from langchain.document_loaders import TextLoader
from langchain.memory import ConversationBufferMemory
from langchain.embeddings.sentence_transformer import SentenceTransformerEmbeddings
from langchain.prompts import PromptTemplate
# load the source document
loader = TextLoader("fatherofnation.txt")
documents = loader.load()
template = """Answer the question in your own words from the
context given to you.
If questions are asked where there is no relevant context available, please answer from
what you know.
Context: {context}
Human: {question}
Assistant:"""
prompt = PromptTemplate(
    input_variables=["context", "question"], template=template
)
text_splitter = CharacterTextSplitter(chunk_size=1000, chunk_overlap=0)
documents = text_splitter.split_documents(documents)
embedding_function = SentenceTransformerEmbeddings(model_name="all-MiniLM-L6-v2")
vectorstore = Chroma.from_documents(documents, embedding_function)
llm = "your llm model here"
memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
memory.save_context({"input": "Who is the founder of India?"},
                    {"output": "Gandhi"})
qa = RetrievalQA.from_chain_type(
    llm,
    retriever=vectorstore.as_retriever(),
    memory=memory,
    chain_type_kwargs={'prompt': prompt}
)
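# Note: with memory attached, the chain saves each {"query": ...} /
# {"result": ...} pair into the buffer after every call, so the exchange
# is available later under the "chat_history" memory key.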
# question = "Who is the father of India nation?"
# result = qa({"query": question})
# print(result)
question1 = "What did I ask about India?"
result1 = qa({"query": question1})
print(result1)

question2 = "Tell me about google in short?"
result2 = qa({"query": question2})
print(result2)
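If follow-ups about earlier turns still fail, one alternative worth trying is ConversationalRetrievalChain, which explicitly condenses the chat history and the new question into a standalone question before retrieval. A minimal sketch, reusing the llm, vectorstore, and memory objects from above:

from langchain.chains import ConversationalRetrievalChain

# The chain rewrites (chat history + new question) into a standalone question,
# retrieves documents for it, and then answers; the history comes from memory.
chat_qa = ConversationalRetrievalChain.from_llm(
    llm=llm,
    retriever=vectorstore.as_retriever(),
    memory=memory,
)

result = chat_qa({"question": "What did I ask about India?"})
print(result["answer"])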
Upvotes: 3