Reputation: 1
How do I ensure the variable `name` from chain1 persists into chain2? Is there a concept of maintaining state in LangChain?

```python
from langchain.prompts import PromptTemplate
from langchain.schema import StrOutputParser
from langchain.llms import OpenAI
import dotenv

dotenv.load_dotenv()

prompt1 = PromptTemplate(
    input_variables=["name"],
    template="I am {name}. Choose one profession for me in one word. Say AYYE when you do.",
)
prompt2 = PromptTemplate(
    input_variables=["profession", "name"],
    template="I am a {profession}. Tell me king of pirates, what is my destiny? Refer to me by my given name.",
)

llm = OpenAI()
chain1 = prompt1 | llm | StrOutputParser()
chain2 = {"profession": chain1} | prompt2 | llm | StrOutputParser()
response = chain2.invoke({"name": "Chopper"})
print(response)
```
I tried referencing `{name}` in the second prompt, but it isn't resolved there, since the dict step feeding prompt2 only supplies `profession`.
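To illustrate what I mean, here is a plain-Python sketch (no LangChain; the `chain1` stub and the `"doctor"` return value are made up) of how I understand the dict step: it maps the original input to a new dict, and only the keys listed in it reach the next prompt, so `name` gets dropped unless it is passed through explicitly.

```python
# Stub standing in for prompt1 | llm | StrOutputParser()
def chain1(inputs):
    return "doctor"  # made-up LLM output

# What my current chain2 dict step does: "name" is lost here,
# because only the listed keys are forwarded to prompt2.
step = lambda inputs: {"profession": chain1(inputs)}

# What I suspect I need: forward the original "name" alongside
# the computed "profession".
step_with_name = lambda inputs: {
    "profession": chain1(inputs),
    "name": inputs["name"],
}

print(step({"name": "Chopper"}))            # {'profession': 'doctor'}
print(step_with_name({"name": "Chopper"}))  # {'profession': 'doctor', 'name': 'Chopper'}
```

Is there a built-in LangChain way to do this passthrough, or do I have to wire it up manually like this?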
Upvotes: 0
Views: 190