Nina

Reputation: 1

My Flask application (using the OpenAI API and LangChain) works locally, but when deployed to Azure it doesn't use the text file to answer my query

I have created an application that uses LangChain and the ChatGPT API to query data from a text file, 'test.txt'.

from flask import Flask, render_template, request
import os
from langchain.chains import ConversationalRetrievalChain
from langchain.chat_models import ChatOpenAI
from langchain.document_loaders import TextLoader
from langchain.embeddings import OpenAIEmbeddings
from langchain.indexes import VectorstoreIndexCreator
from langchain.indexes.vectorstore import VectorStoreIndexWrapper
from langchain.vectorstores import Chroma
import myconstant

# Set the OPENAI API Key
os.environ["OPENAI_API_KEY"] = myconstant.APIKEY

# Configure whether to persist the index
PERSIST = True

app = Flask(__name__)

query = None
chat_history = []

@app.route('/')
@app.route('/index')
def index():
    return render_template('index.html', query=query)

@app.route('/', methods=['POST'])
def process_query():
    global query
    global chat_history

    query = request.form['query']

    if query in ['quit', 'q', 'exit']:
        return "Goodbye!"

    # Load index from persistence or create a new one
    if PERSIST and os.path.exists("persist"):
        vectorstore = Chroma(persist_directory="persist", embedding_function=OpenAIEmbeddings())
        index = VectorStoreIndexWrapper(vectorstore=vectorstore)
    else:
        loader = TextLoader("test.txt", encoding="utf-8")

        if PERSIST:
            # Create an index with persistence
            index = VectorstoreIndexCreator(vectorstore_kwargs={"persist_directory": "persist"}).from_loaders([loader])
        else:
            # Create an index without persistence
            index = VectorstoreIndexCreator().from_loaders([loader])

    # Create a conversational retrieval chain using the specified LLM model and index retriever
    chain = ConversationalRetrievalChain.from_llm(
        llm=ChatOpenAI(model="gpt-3.5-turbo"),
        retriever=index.vectorstore.as_retriever(search_kwargs={"k": 1}),
    )

    # Generate an answer using the conversational chain
    result = chain({"question": query, "chat_history": chat_history})

    # Add the query and answer to chat history
    chat_history.append((query, result['answer']))

    return render_template('index.html', query=query, result=result['answer'])

if __name__ == '__main__':
    app.run(debug=True)

When I run this Flask app locally, it answers my questions using the text file and gives the intended response.

However, the same app deployed to Azure App Service runs, but when I ask the same question it gives a generic response that doesn't use my text file.

For example, when I ask 'Give an example of working with [client]', the local Flask app says "Yes, we collaborated with [client] in 2018, when we worked on [project]."

The Azure app says "I don't have access to specific partnerships and clients. It would be best to check [client]'s website for more detail about this."

I have tried this with multiple different questions and the issue persists.

Upvotes: 0

Views: 398

Answers (1)

Mina

Reputation: 71

Did you verify that the text file exists on Azure?

import os.path

# Prints True if the file is present at the expected path
print(os.path.isfile("path/to/your/test.txt"))

You could also try to print the content of the file, or return it from a temporary debug GET endpoint on your Flask application.
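
For example, a minimal sketch of such a temporary endpoint (the route name /debug-file is just a placeholder, and the path assumes test.txt sits in the working directory):

@app.route('/debug-file')
def debug_file():
    # Temporary debug endpoint: returns the file's contents so you can
    # confirm the deployed app can actually find and read the file
    try:
        with open("test.txt", encoding="utf-8") as f:
            return f.read()
    except OSError as e:
        return f"Could not read file: {e}"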

And did you check the permissions for the text file? Example for checking read permissions:

import os

# Prints True if the process has read permission on the file
print(os.access("path/to/your/test.txt", os.R_OK))
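
Also note that your code opens test.txt with a relative path, which is resolved against the process's current working directory; on Azure App Service that is not necessarily the folder containing your code. A minimal sketch of resolving the path relative to the module instead (assuming test.txt sits next to your Flask file):

import os
from langchain.document_loaders import TextLoader

# Build an absolute path anchored at this module's directory,
# so the file is found regardless of the working directory
BASE_DIR = os.path.dirname(os.path.abspath(__file__))
loader = TextLoader(os.path.join(BASE_DIR, "test.txt"), encoding="utf-8")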

Upvotes: 0
