Reputation: 7428
I'm attempting to upload many images to an S3 bucket. With speed in mind, I'm looking to reduce the number of disk reads and writes by keeping the data in memory. To this end, I have come up with the following scheme:
//fetch binary image data from remote URL
$contents = file_get_contents("http://somesite.com/image.jpg");
//trim the image as per: http://stackoverflow.com/a/15104071/568884
$out = shell_exec('echo ' . base64_encode($contents) . " | base64 -d | convert - -fuzz 10% -trim jpeg:-");
//create a temporary resource to pass to S3's inputResource() method.
$resource = fopen('php://temp', 'r+');
//write the binary data into the empty resource.
fwrite($resource, $out);
//pass the resource and length of binary data into inputResource()
$ir = $this->s3->inputResource($resource, strlen($out));
//finally transfer the resource from machine to S3.
$this->s3->putObject($ir, $bucket, $s3_path, S3::ACL_PUBLIC_READ);
The error is: S3::putObject(): [RequestTimeout] Your socket connection to the server was not read from or written to within the timeout period. Idle connections will be closed and data is not written to S3.
If I replace the assignment of $out with an empty string ($out = "";), then the library successfully writes zero-byte files to S3 as expected.
I'm using the CodeIgniter S3 library, which, as far as I know, is just a wrapper around the AWS S3 REST API.
Upvotes: 2
Views: 3130
Reputation: 198219
You are passing the file handle $resource to the library, but you wrote into it first, so the file pointer is sitting at the end of the stream. The library apparently cannot deal with that edge case (its source code suggests as much).
Call rewind($resource) after writing to the stream but before passing it into the S3 library.
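To see why the rewind matters, here is a standalone sketch of the failure mode (not the library code itself) — without the rewind, a read from the stream returns nothing, because reads start at the current pointer position:

```php
<?php
$out = "binary image data";
$resource = fopen('php://temp', 'r+');
fwrite($resource, $out);

// The pointer now sits at EOF, so a read returns an empty string.
echo var_export(stream_get_contents($resource), true), "\n"; // ''

// rewind() moves the pointer back to offset 0 ...
rewind($resource);

// ... so the full payload is readable again.
echo var_export(stream_get_contents($resource), true), "\n"; // 'binary image data'

fclose($resource);
```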
If you want to speed this up a little, you can keep the data entirely in memory by switching php://temp to php://memory. (Note that php://temp already buffers smaller payloads in memory and only spills to disk past a size threshold.) See the php:// wrapper docs for details and options.
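A quick sketch of the difference between the two wrappers; the 1 KB threshold below is just for illustration (the default spill threshold for php://temp is 2 MB):

```php
<?php
// php://memory keeps the whole buffer in RAM, never on disk.
$mem = fopen('php://memory', 'r+');
fwrite($mem, str_repeat('x', 4096));

// php://temp spills to a temporary file once the buffer exceeds a
// threshold (2 MB by default); it is tunable via /maxmemory, e.g. 1 KB here.
$tmp = fopen('php://temp/maxmemory:1024', 'r+');
fwrite($tmp, str_repeat('x', 4096));

// Both behave identically from the reader's point of view.
rewind($mem);
rewind($tmp);
echo strlen(stream_get_contents($mem)), "\n"; // 4096
echo strlen(stream_get_contents($tmp)), "\n"; // 4096
```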
The S3 library, by the way, is not an official one. If you enable notices and warnings, you will likely see problems reported, because it still contains PHP 4-era code.
Upvotes: 1
Reputation: 6517
A possible source of the RequestTimeout error is that your call to putObject specifies a different Content-Length than the actual data being sent. Per an Amazon representative in the AWS forums:
One way you could trigger a RequestTimeout error is to send a PUT request that specifies a Content-Length of 2 but includes only 1 byte of object data in the request body. After waiting 20 seconds for the remaining byte to arrive, Amazon S3 will respond with a RequestTimeout error.
So it's possible that your temp file is reporting the wrong length when you use the strlen() function, and that this incorrect value is causing S3 to respond with the exception.
According to comments on the PHP documentation for strlen(), the function can report the wrong byte count when the mbstring.func_overload setting is enabled: strlen() is then silently overloaded by mb_strlen(), which counts characters rather than bytes. As one comment puts it:
If that is the case it might treat binary data as a unicode string and return the wrong value.
Try replacing strlen($out) with mb_strlen($out, '8bit'), which always reports the byte count regardless of the overload setting.
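A quick illustration of why the '8bit' encoding matters — strlen() counts bytes, plain mb_strlen() counts characters, and mb_strlen($s, '8bit') forces a byte count (this requires the mbstring extension):

```php
<?php
// "é" is 2 bytes in UTF-8 but 1 character, so "café" is 5 bytes, 4 chars.
$data = "caf\xC3\xA9"; // "café"

echo strlen($data), "\n";              // 5 (bytes)
echo mb_strlen($data, 'UTF-8'), "\n";  // 4 (characters)
echo mb_strlen($data, '8bit'), "\n";   // 5 (bytes again)
```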
Upvotes: 0