Aleš

Reputation: 791

amazon multipart upload PHP sdk version 2

I tried to upload a large file to Amazon S3 using PHP. I found nice solutions on various forums, but those solutions are for SDK version 1.

http://docs.aws.amazon.com/AmazonS3/latest/dev/LLuploadFilePHP.html

Of course, I have found examples in the Amazon API documentation. But that example expects a file on the local disk and cannot handle an input stream.

I couldn't find similar examples for SDK for PHP v2 like the one shown in the first link.

Has anyone solved a similar problem successfully?

Upvotes: 4

Views: 3297

Answers (1)

Jeremy Lindblom

Reputation: 6517

I recently prepared a code sample for this. In this example I am using a file, but you can use a stream as well.

use Aws\S3\S3Client;
use Aws\Common\Enum\Size;

// Instantiate the client.
$s3 = S3Client::factory(array(
    'key'    => '*** your-aws-access-key-id ***',
    'secret' => '*** your-aws-secret-key ***'
));

$file = fopen($filename, 'r');

// 1. Create a new multipart upload and get the upload ID.
$result = $s3->createMultipartUpload(array(
    'Bucket' => $bucket,
    'Key'    => $key
));
$uploadId = $result['UploadId'];

// 2. Upload the data in parts.
$parts = array();
$partNumber = 1;
while (!feof($file)) {
    $result = $s3->uploadPart(array(
        'Bucket'     => $bucket,
        'Key'        => $key,
        'UploadId'   => $uploadId,
        'PartNumber' => $partNumber,
        'Body'       => fread($file, 5 * Size::MB),
    ));
    $parts[] = array(
        'PartNumber' => $partNumber++,
        'ETag'       => $result['ETag'],
    );
}

// 3. Complete multipart upload.
$result = $s3->completeMultipartUpload(array(
    'Bucket'   => $bucket,
    'Key'      => $key,
    'UploadId' => $uploadId,
    'Parts'    => $parts,
));
$url = $result['Location'];

fclose($file);
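One caveat worth adding: if any `uploadPart` call fails, the parts already uploaded remain stored (and billed) in S3 until the upload is aborted. A minimal sketch of cleanup using the SDK v2's `abortMultipartUpload` operation, assuming `$s3`, `$bucket`, `$key`, and `$uploadId` are set up as in the sample above:

```php
// Hedged sketch: abort the multipart upload on failure so that
// already-uploaded parts are discarded instead of lingering in S3.
// Assumes $s3, $bucket, $key, and $uploadId exist as in the sample above.
try {
    // ... the uploadPart loop and completeMultipartUpload call go here ...
} catch (\Exception $e) {
    // Discard all uploaded parts for this upload ID.
    $s3->abortMultipartUpload(array(
        'Bucket'   => $bucket,
        'Key'      => $key,
        'UploadId' => $uploadId,
    ));
    throw $e;
}
```

Also note that every part except the last must be at least 5 MB, which is why the loop reads `5 * Size::MB` at a time.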

Upvotes: 4
