I am a beginner working with LLMs. I am trying to build a RAG system that works with multiple knowledge graphs. My code works when I query a single graph database, but I need to extend it to multiple Neo4j databases. In other words, I want the LLM to be able to answer questions using several graph databases, not just one.
Any ideas or hints on how to approach this?
Thank you.
What I've done so far: here is the part of my current code that works with a single database:
from langchain_community.graphs import Neo4jGraph
from langchain.prompts.prompt import PromptTemplate
from langchain.chains import GraphCypherQAChain
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    openai_api_key=api_key, temperature=0, model='gpt-4'
)

# Connect to the single Neo4j database and load its schema
kg = Neo4jGraph(
    url=uri,
    username=username,
    password=password
)
kg.refresh_schema()

CYPHER_GENERATION_TEMPLATE = upload_cypher_prompt()  # a function that loads the Cypher generation prompt
CYPHER_GENERATION_PROMPT = PromptTemplate(
    input_variables=["schema"],
    template=CYPHER_GENERATION_TEMPLATE
)

QA_PROMPT_TEMPLATE = upload_qa_prompt()  # a function that loads the QA prompt
QA_PROMPT = PromptTemplate(
    input_variables=["context", "question"],
    template=QA_PROMPT_TEMPLATE
)

question = upload_question()  # a function that loads the user's question

cypherChain = GraphCypherQAChain.from_llm(
    llm,
    graph=kg,
    verbose=True,
    cypher_prompt=CYPHER_GENERATION_PROMPT,
    qa_prompt=QA_PROMPT
)

input_variables = {
    "query": question
}
response = cypherChain.invoke(input_variables)
print(response['result'])
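
One idea I had is to create one Neo4jGraph and one GraphCypherQAChain per database, and then pick the right chain for each question. Below is a minimal sketch of that idea; it assumes all graphs live on the same Neo4j server and differ only by database name, and the database names and the routing step are placeholders I made up. I am not sure this is the right approach:

from langchain_community.graphs import Neo4jGraph
from langchain.chains import GraphCypherQAChain

databases = ["movies", "products"]  # hypothetical database names

# Build one graph object and one QA chain per database.
chains = {}
for db_name in databases:
    graph = Neo4jGraph(
        url=uri,
        username=username,
        password=password,
        database=db_name,  # select the target database on the same server
    )
    graph.refresh_schema()
    chains[db_name] = GraphCypherQAChain.from_llm(
        llm,
        graph=graph,
        verbose=True,
        cypher_prompt=CYPHER_GENERATION_PROMPT,
        qa_prompt=QA_PROMPT,
    )

# Simple routing step: ask the LLM which database the question belongs to.
router_prompt = (
    "You are choosing a database for a question.\n"
    f"Available databases: {', '.join(databases)}\n"
    f"Question: {question}\n"
    "Answer with only one database name."
)
chosen_db = llm.invoke(router_prompt).content.strip()

# Fall back to the first database if the LLM answers something unexpected.
if chosen_db not in chains:
    chosen_db = databases[0]

response = chains[chosen_db].invoke({"query": question})
print(response["result"])

Is routing like this a reasonable way to do it, or is there a better pattern for querying several graphs in one chain?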