Reputation: 3650
I am using the AWS API to invoke AWS Step Functions. However, there are limits on how frequently you can call some of the API actions.
According to the AWS Step Function limits, some Step Functions API actions are throttled using a token bucket scheme to maintain service bandwidth.
For instance, StartExecution() has a bucket size of 100 and a refill rate of 2 per second.
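To make the throttling scheme concrete, here is a minimal sketch of how a token bucket with those numbers behaves. The `TokenBucket` class and its method names are purely illustrative (this is not an AWS API); only the capacity of 100 and refill rate of 2/second come from the documented limits.

```python
import time

class TokenBucket:
    """Illustrative token bucket: capacity 100, refilling 2 tokens/second."""

    def __init__(self, capacity=100, refill_rate=2.0):
        self.capacity = capacity
        self.refill_rate = refill_rate
        self.tokens = float(capacity)   # bucket starts full
        self.last = time.monotonic()

    def try_consume(self):
        """Take one token if available; return False if the call would be throttled."""
        now = time.monotonic()
        # Refill in proportion to elapsed time, capped at the bucket capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.refill_rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```

With these defaults you can burst 100 calls immediately, after which sustained throughput is limited to 2 calls per second.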
So, is there a way to monitor the available slots in the bucket?
I looked at the AWS Python SDK (boto3), and there does not seem to be a way to obtain this information. But I am not sure.
What happens to the excess calls: are they queued, or just discarded?
Upvotes: 1
Views: 1821
Reputation: 178984
They're rejected with an error. Since this is a protective control, queueing wouldn't make sense, and silently discarding excess requests wouldn't be a sound design practice.
Throttling
The request was denied due to request throttling.
HTTP Status Code: 400
http://docs.aws.amazon.com/step-functions/latest/apireference/CommonErrors.html
You should be able to simply sleep and retry on error, with an exponential backoff. In its simplest form, that means you sleep for 1 second and retry, then 2, then 4, then 8, etc. Even better, add an additional random interval to each increment, so sleep 1 + rand(1), then 2 + rand(1), or some variation along those lines.
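The sleep-and-retry pattern above can be sketched as a small wrapper. The names `call_with_backoff`, `is_throttle`, and `base_delay` are my own illustrative choices, not boto3 API; in real code `fn` would wrap the `start_execution` call and `is_throttle` would check the error code on the boto3 `ClientError`.

```python
import random
import time

def call_with_backoff(fn, max_retries=5, base_delay=1.0,
                      is_throttle=lambda exc: True):
    """Call fn(), retrying on throttling errors with exponential backoff + jitter.

    Delays follow the pattern described above: 1 + rand(1), 2 + rand(1),
    4 + rand(1), ... (scaled by base_delay). Non-throttling errors, and the
    final failed attempt, are re-raised to the caller.
    """
    for attempt in range(max_retries):
        try:
            return fn()
        except Exception as exc:
            if not is_throttle(exc) or attempt == max_retries - 1:
                raise
            # Exponential delay plus a random jitter in [0, base_delay).
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, base_delay))
```

Note that boto3 clients also have built-in retry behavior that can be tuned via the client's retry configuration, so in practice you may only need this kind of wrapper for retries beyond what the SDK performs itself.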
Upvotes: 2