Reputation: 18983
I have a sequential autogen chat with a relatively high max_turns number and a large context. My idea was to use a low-cost model for my_agent's llm_config and a smarter/more expensive model for the final reflection/summarization.
# one of the chat entries passed to initiate_chats
{
    "recipient": my_agent,
    "max_turns": 5,
    "message": f'Original Input Text:\n{message1}',
    "summary_args": {
        # ideally set a model here that's different from my_agent's model
    },
    "summary_method": "reflection_with_llm",
},
How can something like this be achieved in autogen?
Upvotes: 0
Views: 36
Reputation: 91
You can define a summarizer agent with a cheap LLM such as GPT-3.5 and use a custom summary function that hands the chat's last message to that agent for summarization. Something like this:
import autogen
from autogen import ConversableAgent

api_key = open('api.txt').read().strip()

# Both models share the same API key; the filter below picks out the cheap one.
llm_config = {
    "config_list": [
        {"model": "gpt-4o", "api_key": api_key},
        {"model": "gpt-3.5-turbo", "api_key": api_key},
    ]
}
cheap_filter = {"model": ["gpt-3.5-turbo"]}
john = ConversableAgent(
    name="John",
    system_message="You are a comedian in a play role.",
    llm_config=llm_config,
    human_input_mode="NEVER",
    max_consecutive_auto_reply=6,
)

david = ConversableAgent(
    name="David",
    system_message="You are a comedian in a play role.",
    llm_config=llm_config,
    human_input_mode="NEVER",
    max_consecutive_auto_reply=3,
)
# The summarizer only gets the cheap model, selected via autogen.filter_config.
summarizer = ConversableAgent(
    name="Summarizer",
    system_message="You are a chat summarizer. You receive one chat message at a time and summarize it.",
    llm_config={'config_list': autogen.filter_config(llm_config['config_list'], cheap_filter)},
    human_input_mode="NEVER",
    max_consecutive_auto_reply=1,
)
def summarizeChat(sender, receiver, summary_args):
    # Custom summary method: forward the last message of the finished chat
    # to the summarizer agent and return its reply as the summary string.
    summarizer = summary_args['summarizer']
    result = receiver.initiate_chat(
        summarizer,
        message=receiver.last_message(sender)['content'],
        max_turns=1,
        silent=True,
    )
    return result.summary  # initiate_chat returns a ChatResult; .summary holds the text
david.initiate_chat(
    john,
    message="Do you know why the computer went to the doctor?",
    summary_method=summarizeChat,
    summary_args={"summarizer": summarizer},
)
Upvotes: 1