philip hess

Reputation: 1

Returning source documents with LangChain ConversationalRetrievalChain.from_llm

I'm trying to return source documents using LangChain's ConversationalRetrievalChain.from_llm, but I keep getting an error saying the object expects exactly one output key while it's getting two: 'answer' and 'source_documents'.

I have looked around at other Stack Overflow posts and the LangChain docs (which are a little confusing), and I think I may be using the class incorrectly. Anyway, here is the code:

vectorstore = Pinecone(
    index, embeddings.embed_query, text_field
)

def chat(user_id):
    user_message = request.form.get('message')
    
    # Load the conversation history from session
    conversation_history = session.get(f'conversation_history_{user_id}', [])
    
    bot_temperature = get_bot_temperature(user_id)
    custom_prompt = get_custom_prompt(user_id)

    # Initialize the chatbot with the bot_temperature
    llm = ChatOpenAI(
        openai_api_key=openai_api_key,
        model_name='gpt-3.5-turbo',
        temperature=bot_temperature
    )

    # Define the prompt template with placeholders for context and chat history
    prompt_template = f"""
        {custom_prompt}

        CONTEXT: {{context}}

        QUESTION: {{question}}"""
    
    # Create a PromptTemplate object with input variables for context and question
    TEST_PROMPT = PromptTemplate(input_variables=["context", "question"], template=prompt_template)

    # Create a ConversationBufferMemory object to store the chat history
    memory = ConversationBufferWindowMemory(memory_key="chat_history", return_messages=True, k=8)

    # Create a ConversationalRetrievalChain object with the modified prompt template and chat history memory
    conversation_chain = ConversationalRetrievalChain.from_llm(
            llm=llm,
            retriever=vectorstore.as_retriever(search_kwargs={'filter': {'user_id': f"{user_id}"}}),
            memory=memory,
            combine_docs_chain_kwargs={"prompt": TEST_PROMPT},
            return_source_documents=True
        )
    # Handle the user input and get the response
    response = conversation_chain.run({'question': user_message})
    source_document = response['source_documents'][0]
    print(f"Source document: {source_document}")
    # Save the user message and bot response to session
    conversation_history.append({'input': user_message, 'output': response})
    session[f'conversation_history_{user_id}'] = conversation_history
    
    # print(f"User: {user_message} | Bot:{response}")  # This will print the conversation history
    print(conversation_history)
    print(session)
    print("*"*100)
    
    return jsonify(response=response)

I have tried grabbing the value from the dict of source documents and also using the invoke method instead of run, and it still isn't working. Here is the error I am getting:

Traceback (most recent call last):
  File "/opt/homebrew/lib/python3.11/site-packages/flask/app.py", line 1455, in wsgi_app
    response = self.full_dispatch_request()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/flask/app.py", line 869, in full_dispatch_request
    rv = self.handle_user_exception(e)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/flask/app.py", line 867, in full_dispatch_request
    rv = self.dispatch_request()
         ^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/flask/app.py", line 852, in dispatch_request
    return self.ensure_sync(self.view_functions[rule.endpoint])(**view_args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/philiphess_1/Desktop/Coding/HR_bot/hr_bot_demo/app.py", line 334, in chat
    response = conversation_chain.run({'question': user_message})
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/langchain/chains/base.py", line 500, in run
    _output_key = self._run_output_key
                  ^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/langchain/chains/base.py", line 449, in _run_output_key
    raise ValueError(
ValueError: `run` not supported when there is not exactly one output key. Got ['answer', 'source_documents'].
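For context, the check that raises this error can be sketched in plain Python. This is a simplified illustration of the behavior described in the traceback, not LangChain's actual source: `run()` returns a single string, so it refuses to guess when the chain produces more than one output key.

```python
def run_chain(outputs: dict) -> str:
    """Simplified sketch of why a chain's run() rejects multi-key output.

    run() is a convenience wrapper that returns one value, so it raises
    when the chain's result dict holds more than one output key.
    """
    output_keys = list(outputs)
    if len(output_keys) != 1:
        raise ValueError(
            "`run` not supported when there is not exactly one output key. "
            f"Got {output_keys}."
        )
    return outputs[output_keys[0]]


# With return_source_documents=True the chain yields two keys,
# so a run()-style call fails; the full dict has to be used instead.
result = {"answer": "...", "source_documents": ["doc1"]}
try:
    run_chain(result)
except ValueError as exc:
    print(exc)
```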

Upvotes: 0

Views: 1270

Answers (1)

I had a similar issue a few weeks ago with the same chain, but with a ConversationBufferMemory. I just added the input_key and output_key as shown below and it worked.

memory = ConversationBufferMemory(
        memory_key="chat_history",
        input_key="question",
        output_key="answer",
        return_messages=True,
        chat_memory=message_manager,
)

Upvotes: 0
