tronnlives

Reputation: 31

How to keep conversation context of multiple users separate for LLM chatbot coded in Python-Flask

Apologies in advance as this is probably an easy thing to fix. I'm still new to programming and learning as I go.

I'm working on a chatbot using Python Flask, OpenAI's LLM, and the LlamaIndex library. It's currently hosted on AWS.

Right now, everyone who uses it, regardless of device, shares the same chat history. So if one person says their name is Bob, anyone else can ask "What's my name?" and the bot will answer "Bob." I want each person to have their own chat history with the bot, even if they switch between devices, so the chatbot remembers conversations with individual users instead of mixing everyone's chats together.

I know LLMs are stateless, so each API call starts fresh, and I know I'm using LlamaIndex to pass in the chat history, but I don't know how to start a new session for each device/user.
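
To illustrate the shape of the problem, my app boils down to something like this (simplified; query_llm is just a stand-in for my actual LlamaIndex call):

from flask import Flask, request

app = Flask(__name__)

# chat_history lives at module level, so every request shares the same list
chat_history = []

@app.route("/chat", methods=["POST"])
def chat():
    chat_history.append({"role": "user", "content": request.json["message"]})
    answer = query_llm(chat_history)  # placeholder for my LlamaIndex call
    chat_history.append({"role": "assistant", "content": answer})
    return {"answer": answer}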

I've tested this outside of AWS on my local machine using different browsers, so I don't think it's NGINX.

I've also tried Flask's sessions, but haven't had any luck.

Has anyone done something like this or have any tips on where to start?

Upvotes: 3

Views: 2849

Answers (1)

Franky

Reputation: 17

This is an old question, but I hope this answer can help someone.

If you are querying the LLM with the ollama Python package, it's straightforward: when calling the chat method of the Client class, you can pass the chat history through the messages parameter. It accepts a list of dictionaries, and each dictionary must have "role" and "content" keys. The role is 'user' for the user's messages, 'assistant' for the model's earlier replies, and 'system' for system prompts.

This is an example:

from ollama import Client

client = Client(
    host='http://localhost:11434',
    headers={
        'x-some-header': 'some-value'
    }
)

# Messages you stored somewhere
llmMessages = [
    {
        'role': 'user',
        'content': 'what color is the sky?'  # old question from the user
    },
    {
        'role': 'assistant',  # earlier model replies use the 'assistant' role
        'content': 'the sky is blue'  # old answer from the model
    }
]

userMessage = {
    'role': 'user',
    'content': 'why is it blue?'  # new question from the user
}
llmMessages.append(userMessage)  # add the new question to the history

stream = client.chat(
    model='llama3.2',
    messages=llmMessages,
    stream=True,
    options={
        'temperature': 0
    }
)

# Print the reply as it streams in
for chunk in stream:
    print(chunk['message']['content'], end='', flush=True)

You can read more in the ollama Python package's documentation.
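
To tie this back to the original Flask question: what keeps users separate is where you store llmMessages, not the LLM call itself. Below is a minimal sketch that keys an in-memory history on Flask's session cookie (the histories dict, route name, and uuid-based id are assumptions for illustration; a real app would persist histories in a database, and remembering a user across devices would require keying on a login-based user id instead of a cookie):

import uuid
from flask import Flask, request, session
from ollama import Client

app = Flask(__name__)
app.secret_key = "replace-with-a-real-secret"  # required for Flask sessions

client = Client(host='http://localhost:11434')

# One message list per user id; in-memory for this sketch only
histories = {}

@app.route("/chat", methods=["POST"])
def chat():
    # The session cookie gives each browser its own stable id
    if "uid" not in session:
        session["uid"] = str(uuid.uuid4())
    llmMessages = histories.setdefault(session["uid"], [])
    llmMessages.append({"role": "user", "content": request.json["message"]})
    response = client.chat(model="llama3.2", messages=llmMessages)
    reply = response["message"]["content"]
    llmMessages.append({"role": "assistant", "content": reply})
    return {"answer": reply}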

Upvotes: 1
