weifeng tripods

Reputation: 1

Customize prompt in LlamaIndex

I have built a chatbot using LlamaIndex to get responses from a PDF. I also want to add a custom prompt: if the user's message is about booking an appointment, respond with "booknow!".

Here's my basic implementation:

upload_dir = 'uploads/machinebuilt'
file_paths = [os.path.join(upload_dir, filename) for filename in os.listdir(upload_dir) if os.path.isfile(os.path.join(upload_dir, filename))]
documents = SimpleDirectoryReader(input_files=file_paths).load_data()
index = VectorStoreIndex.from_documents(documents)
chat_engine = index.as_chat_engine(response_mode="compact", text_qa_template=PromptTemplate(text_qa_template_str))
response = chat_engine.chat(question)
json_response = json.dumps({"response": response}, default=custom_serializer)
response_dict = json.loads(json_response)
final_response = response_dict['response']

How would I add the prompt without disturbing the existing performance?

My attempt, but the booking response isn't working:

question = request.json.get('question')

qa_prompt_str = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context information and not prior knowledge, "
    "answer the question: {query_str}\n"
)

refine_prompt_str = (
    "We have the opportunity to refine the original answer "
    "(only if needed) with some more context below.\n"
    "------------\n"
    "{context_msg}\n"
    "------------\n"
    "Given the new context, refine the original answer to better "
    "answer the question: {query_str}. "
    "If the question is about or related to booking an appointment, output the Appointment Answer \n"
    "Appointment Answer: booknow!"
)

chat_text_qa_msgs = [
    ChatMessage(
        role=MessageRole.SYSTEM,
        content=(
            "Always answer the question, even if the context isn't helpful."
        ),
    ),
    ChatMessage(role=MessageRole.USER, content=qa_prompt_str),
]

text_qa_template = ChatPromptTemplate(chat_text_qa_msgs)

# Refine Prompt
chat_refine_msgs = [
    ChatMessage(
        role=MessageRole.SYSTEM,
        content=(
            "Always answer the question, even if the context isn't helpful."
        ),
    ),
    ChatMessage(role=MessageRole.USER, content=refine_prompt_str),
]
refine_template = ChatPromptTemplate(chat_refine_msgs)

upload_dir = 'uploads/machinebuilt'
file_paths = [os.path.join(upload_dir, filename) for filename in os.listdir(upload_dir) if os.path.isfile(os.path.join(upload_dir, filename))]
documents = SimpleDirectoryReader(input_files=file_paths).load_data()
index = VectorStoreIndex.from_documents(documents)
chat_engine = index.as_chat_engine(response_mode="compact", text_qa_template=text_qa_template, refine_template=refine_template)
response = chat_engine.chat(question)
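An alternative worth considering, independent of the prompt templates entirely: detect the booking intent in application code and short-circuit before the chat engine is ever called, so retrieval performance is untouched. A minimal sketch, assuming a simple keyword check (the keyword list and `answer` helper are illustrative, not part of LlamaIndex):

```python
# Hypothetical pre-check: return the canned booking reply before the LLM call.
BOOKING_KEYWORDS = ("book", "appointment", "schedule", "reserve")

def answer(question: str, chat_engine=None) -> str:
    """Return "booknow!" for booking questions, else defer to the chat engine."""
    if any(kw in question.lower() for kw in BOOKING_KEYWORDS):
        return "booknow!"
    # Fall back to the existing RAG pipeline for everything else.
    return str(chat_engine.chat(question))
```

A keyword check is crude compared to asking the LLM to classify the intent, but it is deterministic and cannot be overridden by the model ignoring an instruction buried in the refine prompt.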

Upvotes: 0

Views: 813

Answers (1)

amamr Nafir

Reputation: 1

It seems to me you're not specifying which LLM you're working with. Other than that, I recommend adding the part "If the question is about or related to booking an appointment, output the Appointment Answer \n" "Appointment Answer: booknow!"

to the qa_prompt_str as well, not only to the refine prompt. The refine template is only applied when the answer is refined across multiple context chunks, so an instruction placed there alone can be skipped entirely.
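A sketch of that suggestion, folding the booking rule into qa_prompt_str so it applies on the first QA pass rather than only during refinement:

```python
# The asker's QA prompt with the booking instruction merged in.
qa_prompt_str = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context information and not prior knowledge, "
    "answer the question: {query_str}\n"
    "If the question is about or related to booking an appointment, "
    "output the Appointment Answer.\n"
    "Appointment Answer: booknow!"
)
```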

Upvotes: 0
