Reputation: 111
I am trying to find out if it's possible to fetch multiple objects from S3 in PHP.
Performing an HTTP request for each object in a blocking, sequential way can hurt performance quite badly, and I was wondering if there is a way to get around this. The API doesn't seem to have a batch get endpoint, only batch delete and copy. I was thinking maybe Guzzle could help me do it in parallel if AWS doesn't provide a way to do this.
Any help would be really appreciated.
Upvotes: 1
Views: 1034
Reputation: 6527
You should be able to issue concurrent requests through the AWS SDK, since it is built on Guzzle and follows Guzzle's conventions for async requests. See "Executing commands in parallel" in the SDK's user guide.
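For illustration, here is a minimal sketch using the AWS SDK for PHP (v3): `getObjectAsync()` returns a promise instead of blocking, so the SDK sends the requests concurrently and you wait on them together. The bucket name, region, and keys are placeholders you would replace with your own.

```php
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use GuzzleHttp\Promise\Utils;

$client = new S3Client([
    'region'  => 'us-east-1',   // adjust to your bucket's region
    'version' => 'latest',
]);

$keys = ['file1.txt', 'file2.txt', 'file3.txt'];  // placeholder keys

// Queue one async GetObject per key; nothing blocks here.
$promises = [];
foreach ($keys as $key) {
    $promises[$key] = $client->getObjectAsync([
        'Bucket' => 'my-bucket',  // placeholder bucket
        'Key'    => $key,
    ]);
}

// Wait for all requests to settle; they run concurrently in the meantime.
$results = Utils::all($promises)->wait();

foreach ($results as $key => $result) {
    echo $key . ': ' . strlen((string) $result['Body']) . " bytes\n";
}
```

On older versions of guzzlehttp/promises you would call `GuzzleHttp\Promise\all()` instead of `Utils::all()`. For a very large number of objects, the SDK's `Aws\CommandPool` is worth a look as well, since it lets you cap how many requests are in flight at once.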
Upvotes: 1