Reputation: 25
I am trying to build an LLM-based question-answering system over PDFs that can be accessed via an API (for an external chatbot). So far I have used LangChain with the OpenAI API (model 'text-davinci-003') and Chromadb, and got it to work. The next step, where I am stuck, is making this available via an API so my external chatbot can access it. I have tried both vLLM and OpenLLM but got stuck in equal measure.
Any help/pointers would be greatly appreciated! Many thanks!
Here's my code so far:
from langchain import OpenAI, VectorDBQA
from langchain.embeddings.openai import OpenAIEmbeddings
from langchain.vectorstores import Chroma
from langchain.document_loaders import PyPDFLoader

OPENAI_API_KEY = 'XXXXXX'

# Load the PDF and split it into pages
loader = PyPDFLoader("https://Some_doc.pdf")
pages = loader.load_and_split()

# Embed the pages and index them in Chroma
embeddings = OpenAIEmbeddings(openai_api_key=OPENAI_API_KEY)
docsearch = Chroma.from_documents(pages, embeddings)

# Build the retrieval QA chain
llm = OpenAI(model_name='text-davinci-003', temperature=0, openai_api_key=OPENAI_API_KEY)
qa_chain = VectorDBQA.from_chain_type(llm=llm, chain_type='stuff', vectorstore=docsearch)
qa_chain.run('some question…')
PS: I am using Jupyter Notebooks in Azure with CPU compute (I don't have GPUs in my subscription, so that limits some models too).
Upvotes: 1
Views: 1667
Reputation: 2816
You can create a FastAPI application and wrap your chain in an endpoint:
pip install fastapi uvicorn
# Sample code (assuming your FastAPI application is named `app` in a file called main.py)
from fastapi import FastAPI

app = FastAPI()

@app.get("/")
def process_pdf_docs():
    return {"Hello": "World"}  # your code here

# Run the app
uvicorn main:app --reload
Upvotes: 0