Joel

Reputation: 111

AWS S3 batch get with PHP SDK, or parallel usage for increased performance

I am trying to find out if it's possible to fetch multiple objects from S3 in PHP.

Performing an HTTP request for each object in a blocking, sequential way can hurt performance quite badly, and I was wondering if there is a way to get around this. The API doesn't seem to have a batch get endpoint, only batch delete and copy. I was thinking maybe Guzzle could help me do it in parallel if AWS doesn't provide a way to do this.

Any help would be really appreciated.

Upvotes: 1

Views: 1034

Answers (1)

Jeremy Lindblom

Reputation: 6527

You should be able to do concurrent requests following Guzzle's conventions via the AWS SDK. See Executing commands in parallel in the SDK's user guide.
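To illustrate the idea, here is a minimal sketch using the AWS SDK for PHP v3, where each `Async` operation returns a Guzzle promise and the requests are sent concurrently. The bucket name and keys are placeholders; error handling is reduced to the essentials.

```php
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use GuzzleHttp\Promise\Utils;

$s3 = new S3Client([
    'region'  => 'us-east-1',   // placeholder region
    'version' => 'latest',
]);

$keys = ['file-a.json', 'file-b.json', 'file-c.json'];  // placeholder keys

// Kick off all GET requests without blocking; each call returns a promise.
$promises = [];
foreach ($keys as $key) {
    $promises[$key] = $s3->getObjectAsync([
        'Bucket' => 'my-bucket',  // placeholder bucket
        'Key'    => $key,
    ]);
}

// Wait for every request to settle; resolves to an array of results keyed by object key.
$results = Utils::all($promises)->wait();

foreach ($results as $key => $result) {
    echo $key . ': ' . strlen((string) $result['Body']) . " bytes\n";
}
```

For a large number of objects, `Aws\CommandPool` is worth a look instead, since it lets you cap how many requests are in flight at once rather than firing them all simultaneously.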

Upvotes: 1
