Reputation: 23
Lately I have been trying to use LangChain to create a chatbot that answers questions based only on the given context. I cannot find a parameter that restricts the model to my data; sometimes it gives answers from outside knowledge.
conversation = ConversationalRetrievalChain.from_llm(
    llm=ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0),
    retriever=vectorstore.as_retriever(search_kwargs={"k": 1}),
    memory=self.memory,
)
Upvotes: 0
Views: 624
Reputation: 2876
You can do it by stating the restriction explicitly in the prompt. Note that `combine_docs_chain_kwargs` expects a `PromptTemplate` (with `{context}` and `{question}` placeholders), not a raw string:
from langchain.prompts import PromptTemplate
from langchain.chains import ConversationalRetrievalChain

# Prompt that restricts answers to the retrieved documents
prompt = PromptTemplate.from_template(
    "You are an AI assistant who answers questions. You only know what is "
    "contained in the provided documents. Do not use any outside knowledge.\n\n"
    "{context}\n\nQuestion: {question}"
)

# Create conversational retrieval chain with the restrictive prompt
chain = ConversationalRetrievalChain.from_llm(
    llm=chat_model,
    retriever=vectorstore.as_retriever(search_kwargs={"k": 1}),
    combine_docs_chain_kwargs={"prompt": prompt},
)
Upvotes: 0