Reputation: 1293
I have started implementing the OpenAI GPT model in Python. When I send even a single request, I get a RateLimitError.
My code looks like this:
import openai
key = '<SECRET-KEY>'
openai.api_key = key
model_engine = 'text-ada-001'
prompt = 'Hi, How are you today?'
completion = openai.Completion.create(engine=model_engine, prompt=prompt, max_tokens=2048, n=1, stop=None, temperature=0.5)
print(completion.choices)
This is the error I am getting:
openai.error.RateLimitError: You exceeded your current quota, please check your plan and billing details.
So how do I develop without hitting this error? I have checked the docs; they provide a free tier with limitations, but I am only at the initial stage and have sent just 5-6 requests in an hour.
Thanks in advance for your help.
Upvotes: 6
Views: 17995
Reputation: 79
Later versions (>= 1.0.0) of the openai Python API include functionality to automatically retry requests. See the GitHub discussion.
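As a minimal sketch of what that looks like, assuming openai >= 1.0.0 (the `max_retries` client option exists in the 1.x SDK; the model name and prompt below are just placeholders):

```python
from openai import OpenAI

# The 1.x SDK retries failed requests (including rate-limit errors)
# automatically with backoff; max_retries controls the retry budget.
client = OpenAI(api_key="<SECRET-KEY>", max_retries=5)

completion = client.completions.create(
    model="gpt-3.5-turbo-instruct",  # text-ada-001 is no longer available
    prompt="Hi, how are you today?",
    max_tokens=2048,
    n=1,
    temperature=0.5,
)
print(completion.choices)
```

Note that automatic retries will not help if the error is a hard quota limit (no remaining credit) rather than a transient rate limit.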
Upvotes: 3
Reputation: 79
You can use e.g. https://github.com/phelps-sg/openai-pygenerator to automatically retry requests when a RateLimitError occurs.
Edit: the new 1.0.0 Python API includes functionality for automatically retrying requests.
Upvotes: 1
Reputation: 5844
This probably stems from the server being overloaded. There is an article on OpenAI's help subdomain.
If you encounter a RateLimitError, please try the following steps:
- Wait until your rate limit resets (one minute) and retry your request. The error message should give you a sense of your usage rate and permitted usage.
- Send fewer tokens or requests, or slow down. You may need to reduce the frequency or volume of your requests, batch your tokens, or implement exponential backoff. You can read our rate limit guidance here.
- You can also check your usage statistics from your account dashboard.
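The exponential-backoff step above can be sketched as a small wrapper, assuming you pass in a zero-argument callable that makes the API call (in a real script the bare `Exception` would be narrowed to `openai.error.RateLimitError`):

```python
import random
import time

def with_exponential_backoff(request_fn, max_retries=5, base_delay=1.0):
    """Call request_fn, retrying with exponential backoff on failure.

    request_fn is any zero-argument callable, e.g. a lambda wrapping
    openai.Completion.create; catch openai.error.RateLimitError instead
    of the bare Exception used here for illustration.
    """
    for attempt in range(max_retries):
        try:
            return request_fn()
        except Exception:  # narrow to openai.error.RateLimitError in practice
            if attempt == max_retries - 1:
                raise  # out of retries, surface the error
            # Wait 1s, 2s, 4s, ... plus a little random jitter so that
            # many clients do not retry in lockstep.
            delay = base_delay * (2 ** attempt) + random.uniform(0, 0.5)
            time.sleep(delay)
```

Usage would be `with_exponential_backoff(lambda: openai.Completion.create(...))`. The jitter and doubling delay are standard choices; tune `base_delay` to your rate limit window.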
More helpful info:
The second link lists the limits your script should take into account when making or retrying requests. All of this info was found by doing a web search on "openai.error.RateLimitError".
Upvotes: 2