CaptainAmerica

Reputation: 81

Predicting next questions in an LLM-powered chatbot

I am building a question-answering chatbot powered by LLMs. I have seen that chatbots like Bing Chat predict the top three questions the user might ask next.

My question is: How would I do the same in my chatbot?

I have implemented the QA chatbot using LangChain.

Methods I thought of:

  1. Prompting the LLM with the history (both user questions and bot replies), followed by a line instructing it to predict the next question. This produces very vague and lengthy questions.
  2. Fine-tuning a small model like GPT-2 to predict the next question. The fine-tuning dataset could be generated with ChatGPT.

Are there any other methods/tools for this task (I couldn't find any)?
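For reference, method 1 above roughly looks like this. This is a sketch: `call_llm` is a hypothetical stand-in for whatever completion API is used, and only the prompt construction is shown.

```python
def build_followup_prompt(history, n=3):
    """Build a prompt asking for the top-n likely next user questions.

    `history` is a list of (role, text) pairs from the conversation so far.
    """
    transcript = "\n".join(f"{role}: {text}" for role, text in history)
    return (
        f"Given the conversation below, list the {n} most likely "
        "questions the user will ask next.\n"
        "Keep each question short and specific, one per line.\n\n"
        f"{transcript}\n\nNext questions:"
    )

history = [
    ("User", "What is LangChain?"),
    ("Bot", "LangChain is a framework for building LLM applications."),
]
prompt = build_followup_prompt(history)
# print(call_llm(prompt))  # call_llm is hypothetical; plug in your LLM client
```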

Upvotes: 0

Views: 2985

Answers (1)

ZKS

Reputation: 2816

I tried the two options below, with and without history, and was able to predict the next question successfully; I just made sure the role and context were set correctly. You can tweak your code and build a chain where the first chain provides the answer based on domain knowledge (your actual question-answering use case) and the second chain predicts the next question.

    from langchain.chat_models import ChatOpenAI
    from langchain.prompts.chat import (
        ChatPromptTemplate,
        SystemMessagePromptTemplate,
        HumanMessagePromptTemplate,
    )
    from langchain.chains import LLMChain, ConversationChain
    from langchain.memory import ConversationBufferMemory
    from langchain.prompts import PromptTemplate


    # Option 1: without history
    template = """You are a helpful assistant that predicts the next question based on the current question.
                  You always provide the predicted question on a new line."""
    system_message_prompt = SystemMessagePromptTemplate.from_template(template)
    human_template = "{text}"
    human_message_prompt = HumanMessagePromptTemplate.from_template(human_template)

    chat_prompt = ChatPromptTemplate.from_messages([system_message_prompt, human_message_prompt])
    chain = LLMChain(
        llm=ChatOpenAI(temperature=1),
        prompt=chat_prompt,
        verbose=True
    )
    print(chain.run("Is computer science the right field?"))

    # Option 2: with history
    template = """You are a helpful assistant that predicts the next question based on the chat history.
                  Don't forget: always provide the predicted question on a new line with the prefix "Predicted Question".

    Current conversation:
    {history}
    Human: {input}
    AI Assistant:"""
    PROMPT = PromptTemplate(input_variables=["history", "input"], template=template)
    conversation = ConversationChain(
        prompt=PROMPT,
        llm=ChatOpenAI(model_name="gpt-3.5-turbo", temperature=1),
        verbose=True,
        memory=ConversationBufferMemory(ai_prefix="AI Assistant"),
    )

    print(conversation.predict(input="Is computer science a good field?"))
    print(conversation.predict(input="Is computer science a complicated field?"))

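The two-chain idea mentioned above (answer first, then predict follow-ups) can be sketched in plain Python, independent of any LangChain version. `llm` here is a stub standing in for a real model call; replace it with your actual client.

```python
def llm(prompt):
    # Stub standing in for a real LLM call; returns a canned response.
    return "stubbed response for: " + prompt.splitlines()[0]

def answer_then_predict(question):
    """First step answers from domain knowledge; second step predicts
    likely follow-up questions from the question/answer pair."""
    answer = llm(
        "Answer the user's question using your domain knowledge.\n"
        f"Question: {question}"
    )
    followups = llm(
        "Suggest three likely next questions, one per line.\n"
        f"Question: {question}\nAnswer: {answer}"
    )
    return answer, followups

answer, followups = answer_then_predict("Is computer science a good field?")
```

With a real model, the second prompt sees both the question and the generated answer, which tends to keep the predicted follow-ups grounded in what was actually said.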
Upvotes: 1
