Gene

Reputation: 11

How does Amazon S3 scale with many large file uploads?

I'm using the AWS SDK for Python (boto3) to put objects into Amazon S3 storage.

Suppose I'm uploading 100 different files (sizes varying from megabytes to gigabytes) continuously. How does Amazon S3 handle this scenario?

Do I need to wait for each request to succeed, or can I keep uploading continuously and let S3 handle it internally?

If S3 takes care of this scenario, does it have any built-in mechanism, such as a queue or parallel processing?

Upvotes: 1

Views: 429

Answers (1)

John Rotenstein

Reputation: 270114

Amazon S3 is a highly scaled, distributed system that handles millions of transactions per second. It is used by very large services that put it under far greater load than your requirements. For example, in Europe it is used to store data from Dropbox users.

Amazon S3 can handle your parallel requests, so you do not need to wait between requests, nor queue them yourself. There may be occasional transient network errors, so your application should detect errors and retry as appropriate.
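A minimal sketch of that pattern: submit all uploads to a bounded thread pool and retry each one on failure. The `upload_one` callable is a hypothetical stand-in; in a real program it would wrap something like `boto3.client('s3').upload_file(path, bucket, key)`, which also handles multipart uploads for large files.

```python
import random
import time
from concurrent.futures import ThreadPoolExecutor, as_completed


def with_retries(fn, attempts=3, base_delay=0.5):
    """Retry fn on any exception with exponential backoff plus jitter,
    to absorb the occasional transient network error."""
    def wrapped(*args, **kwargs):
        for attempt in range(attempts):
            try:
                return fn(*args, **kwargs)
            except Exception:
                if attempt == attempts - 1:
                    raise  # out of attempts: surface the error
                time.sleep(base_delay * 2 ** attempt + random.random() * 0.1)
    return wrapped


def upload_all(paths, upload_one, max_workers=8):
    """Upload every file concurrently; S3 accepts the parallel requests,
    so there is no need to wait between them. upload_one(path) is the
    per-file upload callable (e.g. a boto3 wrapper)."""
    results = {}
    upload = with_retries(upload_one)
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = {pool.submit(upload, p): p for p in paths}
        for fut in as_completed(futures):
            results[futures[fut]] = fut.result()
    return results
```

Tune `max_workers` to your bandwidth rather than to S3: the service side is not the bottleneck here.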

The biggest limitation is likely to be your Internet bandwidth (or the bandwidth assigned to an Amazon EC2 instance).

Bottom line: Don't worry. It's big.

Upvotes: 2
