kslote1

Reputation: 740

Langchain with Azure AttributeError: Can't get attribute 'Document' when importing langchain.schema in Python

I'm working on a Python project that uses the langchain library with Azure embeddings. However, I'm encountering an issue when trying to load a serialized FAISS index built with OpenAIEmbeddings generated by Azure. Here's the error message I'm getting:

AttributeError: Can't get attribute 'Document' on <module 'langchain.schema' from '/Users/home/python3.10/site-packages/langchain/schema.py'>

Here is the full error:

ERROR:    Traceback (most recent call last):
  File "/Users/home/anaconda3/envs/chatbot2/lib/python3.10/site-packages/starlette/routing.py", line 671, in lifespan
    async with self.lifespan_context(app):
  File "/Users/home/anaconda3/envs/chatbot2/lib/python3.10/site-packages/starlette/routing.py", line 566, in __aenter__
    await self._router.startup()
  File "/Users/home/anaconda3/envs/chatbot2/lib/python3.10/site-packages/starlette/routing.py", line 648, in startup
    await handler()
  File "/Users/home/chat-langchain/./main.py", line 45, in startup_event
    vectorstore = FAISS.load_local("faiss_index", embeddings)
  File "/Users/home/python3.10/site-packages/langchain/vectorstores/faiss.py", line 297, in load_local
    docstore, index_to_docstore_id = pickle.load(f)
AttributeError: Can't get attribute 'Document' on <module 'langchain.schema' from '/Users/home/anaconda3/envs/chatbot2/lib/python3.10/site-packages/langchain/schema.py'>

I'm using Python 3.10, and the langchain library is installed in my Anaconda environment. I'm not sure what's causing this error or how to resolve it. I've tried reinstalling the langchain and openai libraries, but the issue persists.

Here's the code snippet where the error occurs:

import logging
from pathlib import Path
from typing import Optional

from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import FAISS, VectorStore

vectorstore: Optional[VectorStore] = None


@app.on_event("startup")
async def startup_event():
    logging.info("loading vectorstore")
    if not Path("helpdesk_faiss_index").exists():
        raise ValueError("helpdesk_faiss_index does not exist, please run ingest.py first")
    embeddings = OpenAIEmbeddings()
    global vectorstore
    vectorstore = FAISS.load_local("faiss_index", embeddings)

Libraries used

openai==0.27.2
langchain==0.0.123
faiss-cpu==1.7.3

The error seems to occur because the OpenAI embeddings were generated with Azure. Any help or guidance would be greatly appreciated. Thank you!
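One plausible mechanism behind this error (an assumption, not confirmed by the traceback alone) is version skew rather than Azure itself: pickle stores only the import path of each serialized object's class (here `langchain.schema.Document`), so an index written by one langchain version fails to unpickle under a version where that attribute has moved or been removed. A minimal, library-free sketch of the same failure mode, using a hypothetical stand-in class:

```python
import pickle


class Document:
    """Hypothetical stand-in for langchain.schema.Document."""
    def __init__(self, page_content: str):
        self.page_content = page_content


# Serialize an instance; the pickle records the class's import path,
# not the class definition itself.
blob = pickle.dumps(Document("hello"))

# Simulate upgrading to a library version where the class no longer
# exists at that path.
del Document

try:
    pickle.loads(blob)
except AttributeError as exc:
    # Same shape as the error in the question:
    # Can't get attribute 'Document' on <module ...>
    print(exc)
```

If this is the cause, re-running the ingestion step with the currently installed langchain version (so the index is written and read by the same version) should regenerate a loadable index.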

Upvotes: 0

Views: 1418

Answers (1)

Ramprasad

Reputation: 132

Here is a sample showing how to run semantic search using langchain:

https://github.com/hwchase17/langchain/blob/master/docs/modules/indexes/vectorstores/getting_started.ipynb

Upvotes: -1
