Reputation: 7
I'm developing a tool using LangChain and OpenAI that requires some custom datasets that I feed in. I am able to get this working using the following:
from langchain.chat_models import AzureChatOpenAI
from langchain.agents.agent_types import AgentType
from langchain_experimental.agents import create_pandas_dataframe_agent

llm = AzureChatOpenAI(
    openai_api_base=OPENAI_API_BASE,
    openai_api_key=OPENAI_API_KEY,
    deployment_name=deployment_name,
    model_name=deployment_name,
    openai_api_version=OPENAI_DEPLOYMENT_VERSION,
)
agent = create_pandas_dataframe_agent(
    llm,
    [dataset1, dataset2],
    verbose=False,
    agent_type=AgentType.OPENAI_FUNCTIONS,
)
The problem is that I need to insert a prompt template into the mix. I have a custom prompt as follows:
from langchain import PromptTemplate
multi_input_template = """
You are an expert in {data}.
{query}
"""
multi_input_prompt = PromptTemplate(
    input_variables=["data", "query"],
    template=multi_input_template,
)
Which I can query as follows:
multi_input_prompt.format(data="AIDS", query="What diagnosis are suitable if presenting with cancer?")
The problem is that if I do this, I can't integrate my custom datasets into the mix.
Any recommendations are greatly appreciated. I am using langchain==0.0.336 and langchain-experimental==0.0.42.
Upvotes: 0
Views: 860
Reputation: 1
It seems like you want to integrate your custom datasets into the prompt template along with the query, i.e. use the datasets themselves as part of the input to the model.
You can do this by making some changes in your template:
multi_input_template = """
You are an expert in {data}.
{query}
Here is some information from dataset 1: {dataset1}
And here is some information from dataset 2: {dataset2}
"""
multi_input_prompt = PromptTemplate(
    input_variables=["data", "query", "dataset1", "dataset2"],
    template=multi_input_template,
)

formatted_prompt = multi_input_prompt.format(
    data="AIDS",
    query="What diagnosis are suitable if presenting with cancer?",
    dataset1=dataset1,
    dataset2=dataset2,
)
Upvotes: -1