Reputation: 29
import ollama
from icecream import ic

def data_input():
    # Gets the data the LLM needs; the user will ask questions about this data
    ...

def chat_with_data(data):
    messages = [
        {
            "role": "system",
            "content": "You are given some data and you have to analyze the data correctly if the user asks for any output then give the output as per the data and user's question otherwise don't give answer."
        },
        {
            "role": "user",
            "content": data
        }
    ]
    try:
        while True:
            user_input = input("Enter your message: ")
            if user_input.lower() == 'exit':
                print("Exiting the chat")
                break
            messages.append({"role": "user", "content": user_input})
            assistant_message = ''
            ollama_response = ollama.chat(model='llama3.2', messages=messages, stream=True)
            for chunk in ollama_response:
                assistant_message += chunk['message']['content']
                print(assistant_message, flush=True)
            messages.append({"role": "assistant", "content": assistant_message})
    except Exception as e:
        ic(e)

data = data_input()
chat_with_data(data)
The LLM follows the instructions and gives a long summary of the data, but then the program ends. It should keep looping until the user types exit. Should I change the while True condition?
Upvotes: 0
Views: 63
Reputation: 9
You have to swap your while True with your try statement. At the moment the try block wraps the entire loop, so the first exception raised anywhere in the loop body transfers control to the except handler, and the loop is over; the try statement only gets one chance to run the loop. If you put the try inside the loop instead, an error only aborts the current iteration and the chat keeps going. Try the following:
while True:
    user_input = input("Enter your message: ")
    if user_input.lower() == 'exit':
        print("Exiting the chat")
        break
    try:
        messages.append({"role": "user", "content": user_input})
        assistant_message = ''
        ollama_response = ollama.chat(model='llama3.2', messages=messages, stream=True)
        for chunk in ollama_response:
            assistant_message += chunk['message']['content']
            # Print only the new chunk, not the whole accumulated message
            print(chunk['message']['content'], end='', flush=True)
        print()
        messages.append({"role": "assistant", "content": assistant_message})
    except Exception as e:
        ic(e)

data = data_input()
chat_with_data(data)
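To see why the placement matters, here is a minimal sketch with no ollama involved (the function names and the "boom" sentinel are made up for illustration): a try wrapped *around* a loop ends the whole loop on the first exception, while a try *inside* the loop only skips the failing iteration.

```python
def loop_wrapped_in_try(items):
    # try around the loop: the first exception kills the loop
    handled = []
    try:
        for item in items:
            if item == "boom":
                raise ValueError(item)
            handled.append(item)
    except ValueError:
        pass  # by the time we get here, the loop is already over
    return handled

def try_inside_loop(items):
    # try inside the loop: an exception only skips that iteration
    handled = []
    for item in items:
        try:
            if item == "boom":
                raise ValueError(item)
            handled.append(item)
        except ValueError:
            continue  # keep looping
    return handled

print(loop_wrapped_in_try(["a", "boom", "b"]))  # ['a']
print(try_inside_loop(["a", "boom", "b"]))      # ['a', 'b']
```

The same logic applies to your chat loop: if ollama.chat raises while try wraps the while, the program falls through to ic(e) and exits instead of prompting again.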
Upvotes: 0