Reputation: 81
I am building a question-answering chatbot powered by LLMs. I have seen that chatbots like Bing Chat predict the top three questions the user might ask next.
My question is: how would I do the same in my chatbot?
I have implemented the QA chatbot using LangChain.
Methods I thought of:
Are there any other methods/tools for this task (I couldn't find any)?
Upvotes: 0
Views: 2985
Reputation: 2816
I tried the two options below, with and without history, and was able to predict the next question successfully; I just made sure the role and context were set correctly. You can also tweak your code and build a pipeline where a first chain provides the answer based on domain knowledge (your actual question-answering use case) and a second chain predicts the next question.
from langchain.chat_models import ChatOpenAI
from langchain.prompts.chat import (
    ChatPromptTemplate,
    SystemMessagePromptTemplate,
    HumanMessagePromptTemplate,
)
from langchain.chains import LLMChain, ConversationChain
from langchain.memory import ConversationBufferMemory
from langchain.prompts import PromptTemplate
# Option 1: without history
template = """You are a helpful assistant in predicting the next question based on the current question.
You always provide the predicted question on a new line."""
system_message_prompt = SystemMessagePromptTemplate.from_template(template)
human_template = "{text}"
human_message_prompt = HumanMessagePromptTemplate.from_template(human_template)
chat_prompt = ChatPromptTemplate.from_messages([system_message_prompt, human_message_prompt])
chain = LLMChain(
    llm=ChatOpenAI(temperature=1),
    prompt=chat_prompt,
    verbose=True,
)
print(chain.run("Is computer science the right field?"))
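Since the prompt only instructs the model to put the prediction on a new line, a small parser is useful for turning the raw completion into a clean list of suggestions. This is a hypothetical helper (not part of LangChain), written to tolerate both the bare and the "Predicted Question:"-prefixed output formats used in the two options:

```python
def parse_predicted_questions(completion: str, limit: int = 3) -> list[str]:
    """Extract up to `limit` predicted questions from a raw model completion.

    Accepts lines with or without a 'Predicted Question:' prefix and
    ignores blank lines, so it works for both prompt variants above.
    """
    questions = []
    for line in completion.splitlines():
        line = line.strip()
        if not line:
            continue
        # Strip the optional prefix the prompt asks the model to emit.
        if line.lower().startswith("predicted question"):
            line = line.split(":", 1)[-1].strip()
        # Keep only lines that actually look like questions.
        if line.endswith("?"):
            questions.append(line)
    return questions[:limit]
```

You can then feed `chain.run(...)` output through this helper and render the resulting list as clickable suggestions in your UI.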
# Option 2: with history
template = """You are a helpful assistant in predicting the next question based on the chat history.
Don't forget: you always provide the predicted question on a new line with a "Predicted Question:" prefix.
Current conversation:
{history}
Human: {input}
AI Assistant:"""
PROMPT = PromptTemplate(input_variables=["history", "input"], template=template)
conversation = ConversationChain(
    prompt=PROMPT,
    llm=ChatOpenAI(model_name="gpt-3.5-turbo", temperature=1),
    verbose=True,
    memory=ConversationBufferMemory(ai_prefix="AI Assistant"),
)
print(conversation.predict(input="Is computer science a good field?"))
print(conversation.predict(input="Is computer science a complicated field?"))
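The two-step idea mentioned above (first chain answers, second chain predicts follow-ups) can also be sketched framework-free, with the LLM injected as a plain callable. The prompts and the function name here are illustrative assumptions, not LangChain APIs; `llm` could be a thin wrapper around `ChatOpenAI`:

```python
ANSWER_SYSTEM = "You are a helpful assistant. Answer using your domain knowledge."
SUGGEST_SYSTEM = (
    "You are a helpful assistant predicting the top three questions the user "
    "may ask next. Output one question per line."
)

def answer_then_suggest(question: str, llm) -> dict:
    """Run the two-step pipeline: answer the question, then predict follow-ups.

    `llm` is any callable taking (system_prompt, user_text) and returning a
    string, so the pipeline stays testable without a real API call.
    """
    answer = llm(ANSWER_SYSTEM, question)
    # Give the second step both the question and the answer as context,
    # so the suggestions follow from the whole exchange.
    raw = llm(SUGGEST_SYSTEM, f"Question: {question}\nAnswer: {answer}")
    suggestions = [line.strip() for line in raw.splitlines() if line.strip()]
    return {"answer": answer, "suggestions": suggestions[:3]}
```

Keeping the answering prompt and the suggestion prompt separate avoids the single-prompt failure mode where the model mixes the answer and the predicted questions into one block.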
Upvotes: 1