kostya ivanov

Reputation: 707

How to realize streaming response from Ollama local LLM in the Streamlit App?

I'm a little confused by the documentation of the components. I need to stream output from a local language model into the Streamlit chat interface. I know there is a new st.write_stream method, but I don't understand how to use it, because I get an error saying that the response from my language model is a string.

Some code:

import streamlit as st
from langchain import LLMChain, PromptTemplate
from langchain.memory import ConversationBufferMemory
from langchain.llms import Ollama
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler
from langchain.callbacks.manager import CallbackManager

callback_manager = CallbackManager([StreamingStdOutCallbackHandler()])
llm = Ollama(model=st.session_state.selected_model, callbacks=callback_manager)
    
initial_instruction = """You are a multifunctional chatbot created to help users. You must remember the context of the dialogue and provide the most accurate answers to the user's requests."""

prompt_template = PromptTemplate(
    input_variables=["input", "chat_history"],
    template=f"""
    {initial_instruction}

    Chat history:
    {{chat_history}}

    User: {{input}}
    """.strip()
)

st.session_state.llm_chain = LLMChain(llm=llm, prompt=prompt_template, memory=st.session_state.memory)

for message in st.session_state.messages:
    if message["role"] == "user":
        with st.chat_message(name="user", avatar="source/ai_user_2.png"):
            st.write(message["content"])
    elif message["role"] == "assistant":
        with st.chat_message(name="assistant", avatar="source/ai_bot_2.png"):
            st.write(message["content"])
                
if prompt := st.chat_input(""):
    with st.chat_message(name="user", avatar="source/ai_user_2.png"):
        st.write(prompt)
        st.session_state.messages.append({"role": "user", "content": prompt})
    
    with st.chat_message(name="assistant", avatar="source/ai_bot_2.png"):

        streamed_response = st.write_stream(st.session_state.llm_chain.run(input=prompt))
        st.session_state.messages.append({"role": "assistant", "content": streamed_response})

Gets error: StreamlitAPIException: st.write_stream expects a generator or stream-like object as input not <class 'str'>. Please use st.write instead for this data type.
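As far as I can tell, the issue is the argument type: llm_chain.run(...) returns the finished answer as a single str, while st.write_stream wants something it can iterate over chunk by chunk. A minimal illustration, with no LLM involved (fake_tokens is just a stand-in for a real token stream):

```python
def fake_tokens():
    # A generator like this is what st.write_stream accepts:
    # it yields the text piece by piece instead of all at once.
    for token in ["Hello", " ", "world"]:
        yield token

# st.write_stream(fake_tokens())   # OK: generator, rendered incrementally
# st.write_stream("Hello world")   # raises StreamlitAPIException: str is not stream-like
```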

How can I stream the response?

Upvotes: 0

Views: 1743

Answers (1)

Arindam

Reputation: 81

You need a runnable/generator for this. LLMChain exposes a .stream() method that returns a generator st.write_stream can consume, instead of .run(), which returns the whole completion as a string. Try this:

streamed_response = st.write_stream(st.session_state.llm_chain.stream(input=prompt))

See the st.write_stream documentation for details.
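If your LangChain version doesn't support .stream() on the chain, a fallback sketch (assuming you accept that the text is fully generated before display, so this only fakes the streaming animation) is to chunk the finished string into a generator yourself:

```python
def chunk_text(text, size=8):
    """Yield successive slices of text so st.write_stream has something to iterate over."""
    for i in range(0, len(text), size):
        yield text[i:i + size]

# Hypothetical usage inside the assistant chat block:
# full_answer = st.session_state.llm_chain.run(input=prompt)
# streamed_response = st.write_stream(chunk_text(full_answer))
```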

Upvotes: 0
