jolly

Reputation: 3598

How to save history and restart a chat from the last point in LangChain with Ollama?

I have an Ollama LangChain chat system. Once a chat ends, I save its history to a DB, but I am not able to load that history to restart a particular chat later. The code, the error, and the saved history are below.

history = {'input': 'What is life?', 'history': 'Human: What is life?\nAI: {}', 'response': '{ "Life" : {\n  "Definition" : "A complex and multifaceted phenomenon characterized by the presence of organization, metabolism, homeostasis, and reproduction.",\n  "Context" : ["Biology", "Philosophy", "Psychology"],\n  "Subtopics" : [\n    {"Self-awareness": "The capacity to have subjective experiences, such as sensations, emotions, and thoughts."},\n    {"Evolutionary perspective": "A process driven by natural selection, genetic drift, and other mechanisms that shape the diversity of life on Earth."},\n    {"Quantum perspective": "A realm where quantum mechanics and general relativity intersect, potentially influencing the emergence of consciousness."}\n  ]\n} }'}

from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory
from langchain.prompts import PromptTemplate
from langchain_community.llms import Ollama

llm = Ollama(model="llama3")  # assumed model; the original defines llm elsewhere

PROMPT_TEMPLATE = """
{history}
"""

custom_prompt = PromptTemplate(
    input_variables=["history"], template=PROMPT_TEMPLATE
)

chain = ConversationChain(
    prompt=custom_prompt,
    llm=llm,
    memory=ConversationBufferMemory()
)
prompt = "How to live it properly?"

answer = chain.invoke(input=prompt)

Error:

  File "miniconda3/lib/python3.11/site-packages/pydantic/v1/main.py", line 341, in __init__
    raise validation_error
pydantic.v1.error_wrappers.ValidationError: 1 validation error for ConversationChain
__root__
  Got unexpected prompt input variables. The prompt expects ['history'], but got ['history'] as inputs from memory, and input as the normal input key. (type=value_error)

How to load the history the right way?

TIA

Upvotes: 0

Views: 2578

Answers (2)

KuLeMi

Reputation: 406

Use this example:

from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory
from langchain_community.chat_message_histories import RedisChatMessageHistory
from langchain_core.prompts import ChatPromptTemplate
from langchain_google_genai import ChatGoogleGenerativeAI

def some_conversation_chain(system_message, memory):
    prompt_template = ChatPromptTemplate.from_messages([
        ("system", system_message),
        ("user", "{input}"),
    ])
    # any chat model works, e.g.: chat_model = ChatOpenAI(temperature=0, model='gpt-4o', api_key=api_key)
    chat_model = ChatGoogleGenerativeAI(model='gemini-1.5-flash', temperature=0, google_api_key=google_api_key)
    llm_chain = ConversationChain(
        llm=chat_model,
        prompt=prompt_template,
        memory=memory,
    )
    return llm_chain

# If you want, you can keep each user's chat history separate and persist it
# somewhere, for example in a Redis DB
redis_chat_history = RedisChatMessageHistory(session_id=client_id, url=redis_url)  # optional
chat_memory = ConversationBufferMemory(
    memory_key="chat_history",
    input_key='input',
    chat_memory=redis_chat_history,  # optional
    return_messages=True
)

system_message = """
Chat history: {chat_history}

You are a helpful AI assistant in ...
"""

some_conv_chain = some_conversation_chain(system_message, chat_memory)
result = some_conv_chain.invoke({'input': "some request to the AI assistant ..."})

print(result['response'])  # the key may differ from `response` depending on the chat model

# if needed, you can inspect the memory like this:
memory_content = chat_memory.load_memory_variables({})
print(memory_content)
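
Because the history is stored in Redis under the session_id, restarting a chat from its last point is just a matter of rebuilding the memory with the same session_id. A minimal sketch, assuming the same client_id, redis_url, and system_message as above:

# Reconnect to an existing session: RedisChatMessageHistory loads all messages
# previously stored under this session_id, so the new chain continues the old chat.
restored_history = RedisChatMessageHistory(session_id=client_id, url=redis_url)
restored_memory = ConversationBufferMemory(
    memory_key="chat_history",
    input_key='input',
    chat_memory=restored_history,
    return_messages=True
)

restored_chain = some_conversation_chain(system_message, restored_memory)
result = restored_chain.invoke({'input': "continue where we left off ..."})
print(result['response'])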

Upvotes: 0

jolly

Reputation: 3598

I found an answer here: https://github.com/langchain-ai/langchain/discussions/3224

One of the easy ways to add history back to memory is memory.chat_memory.add_ai_message(message).
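
For example, a minimal sketch of rebuilding the memory from the saved history dict shown in the question (assuming the same history dict and llm; chat_memory also exposes add_user_message for the human turns):

from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory

# Rehydrate the memory from the saved record by replaying each turn into the buffer
memory = ConversationBufferMemory()
memory.chat_memory.add_user_message(history['input'])   # the human turn
memory.chat_memory.add_ai_message(history['response'])  # the AI turn

# The default ConversationChain prompt already accepts both 'history'
# (filled from memory) and 'input', so no custom prompt is needed here.
chain = ConversationChain(llm=llm, memory=memory)
answer = chain.invoke(input="How to live it properly?")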

Upvotes: 0
