I’m building a cricket-specific chatbot using CrewAI. The chatbot answers queries based on historical cricket data from CSV files and real-time data from APIs, all of which is stored in an SQL database. However, response times for user queries range from 30 seconds to a few minutes, which is far too slow for a chatbot where users expect quick, interactive replies.
Initially, I attempted a Retrieval-Augmented Generation (RAG) setup, but it wasn’t efficient for my use case. I then switched to SQL agents that query the database directly, assuming this would speed up query processing.
The current workflow: CrewAI handles the interaction with the database and generates the responses. The LLM (I’m experimenting with OpenAI and Meta Llama models) queries the SQL database through an agent to retrieve the relevant information and formulate an answer.
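For context, my setup looks roughly like the sketch below. It is a simplified approximation, not my exact code: the connection string, table contents, model name, and example question are placeholders, and the SQL tool is a minimal custom tool rather than whatever built-in tool you might use.

```python
# Rough sketch of the current setup: one CrewAI agent with a custom SQL tool.
# Placeholders: the DB URI, model name, and example question are not my real values.
from crewai import Agent, Task, Crew
from crewai.tools import tool  # in older versions this lived in crewai_tools
from sqlalchemy import create_engine, text

# Placeholder connection string -- the real database holds the cricket tables.
engine = create_engine("postgresql://user:pass@localhost:5432/cricket")

@tool("run_sql_query")
def run_sql_query(query: str) -> str:
    """Execute a read-only SQL query against the cricket database and return the rows."""
    with engine.connect() as conn:
        rows = conn.execute(text(query)).fetchall()
    return "\n".join(str(row) for row in rows)

# A single agent generates the SQL, runs it via the tool, and writes the answer.
sql_agent = Agent(
    role="Cricket data analyst",
    goal="Answer cricket questions by querying the SQL database",
    backstory="Expert in cricket statistics and SQL.",
    tools=[run_sql_query],
    llm="gpt-4o",  # placeholder; I'm also trying Meta Llama models
    verbose=True,
)

def answer(user_question: str) -> str:
    task = Task(
        description=f"Answer the user's question using the database: {user_question}",
        expected_output="A concise, factual answer based on the query results.",
        agent=sql_agent,
    )
    crew = Crew(agents=[sql_agent], tasks=[task])
    return str(crew.kickoff())

print(answer("How many centuries has Virat Kohli scored in ODIs?"))
```

Each user query goes through this full agent loop (LLM call, SQL generation, tool execution, answer generation), which is where the time is being spent.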
Challenge: the time taken to generate a response is too long. Users expect near-instantaneous replies, but the current setup is far from that. How can I reduce the response time to something suitable for an interactive chatbot?