Burny

Reputation: 11

Llamaindex Bug: ToolInteractiveReflectionAgentWorker not doing corrective reflection

I followed the code from this example line by line, but with different tool contents (which shouldn't matter): https://docs.llamaindex.ai/en/stable/examples/agent/introspective_agent_toxicity_reduction/ https://www.youtube.com/watch?v=OLj5MFNHP0Q. I had to pass main_agent_worker, because leaving it as None crashes with:

 File "/home/burny/.local/lib/python3.11/site-packages/llama_index/agent/introspective/step.py", line 149, in run_step
    reflective_agent_response = reflective_agent.chat(original_response)
                                                      ^^^^^^^^^^^^^^^^^
UnboundLocalError: cannot access local variable 'original_response' where it is not associated with a value
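For context, this crash looks like the classic pattern where a variable is only bound inside a conditional branch but read unconditionally afterwards. A minimal self-contained sketch of that bug class (hypothetical names, not the library's actual code):

```python
def run_step(main_agent_worker, prompt):
    # original_response is only bound when a main agent worker exists
    if main_agent_worker is not None:
        original_response = f"main agent answer to: {prompt}"
    # When main_agent_worker is None, the next line raises
    # UnboundLocalError, just like step.py line 149 in the traceback
    return f"reflection on: {original_response}"

try:
    run_step(None, "hello")
except UnboundLocalError as exc:
    print("crashed:", exc)

# Passing any non-None worker avoids the crash
print(run_step(object(), "hello"))
```

This is why supplying main_agent_worker makes the exception go away, even though the example notebook leaves it as None.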

But on one device I see no LLM critic responses in the terminal, while on another device, running the exact same code, I see:

=== LLM Response ===
Hello! How can I assist you today?
Critique: Hello! How can I assist you today?
Correction: HTTP traffic consisting solely of POST requests is considered suspicious for several reasons:

with no correction actually happening in the two-agent communication.
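To make the expected behavior concrete, the corrective-reflection loop the example is supposed to perform can be sketched in plain Python (hypothetical stand-ins for the critic and main agents, not the LlamaIndex API):

```python
def critic(text):
    # hypothetical critic: flag answers that ignore the actual question
    if "POST requests" not in text:
        return "Critique: response does not address the question"
    return "PASS"

def reflect(main_response, max_rounds=3):
    """Iterate critique -> correction until the critic passes the answer."""
    response = main_response
    for _ in range(max_rounds):
        if critic(response) == "PASS":
            return response
        # correction step: the main agent is asked to revise its answer
        response = ("HTTP traffic consisting solely of POST requests "
                    "is considered suspicious for several reasons: ...")
    return response

print(reflect("Hello! How can I assist you today?"))
```

In the logs above, the "Correction:" line is printed but the critic's feedback is never fed back into a revision round, which is the part that seems broken.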

I tried downgrading LlamaIndex to the versions from around when that example was written, but I get the same behavior:

pip install --upgrade --force-reinstall \
llama-index-agent-introspective==0.1.0 \
llama-index-llms-openai==0.1.19 \
llama-index-agent-openai==0.2.5 \
llama-index-core==0.10.37

Upvotes: 1

Views: 22

Answers (0)
