user3173207

Reputation: 289

Amazon S3 copyObject using PHP

I am trying to copy a 1TB file from one bucket to another. I know that this can be done easily if I log into the AWS S3 panel but I would like to do it using PHP.

I am using the following AWS S3 class from GitHub:

public static function copyObject($srcBucket, $srcUri, $bucket, $uri, $acl = self::ACL_PRIVATE, $metaHeaders = array(), $requestHeaders = array(), $storageClass = self::STORAGE_CLASS_STANDARD)
    {
        $rest = new S3Request('PUT', $bucket, $uri, self::$endpoint);
        $rest->setHeader('Content-Length', 0);
        foreach ($requestHeaders as $h => $v) $rest->setHeader($h, $v);
        foreach ($metaHeaders as $h => $v) $rest->setAmzHeader('x-amz-meta-'.$h, $v);
        if ($storageClass !== self::STORAGE_CLASS_STANDARD) // Storage class
            $rest->setAmzHeader('x-amz-storage-class', $storageClass);
        $rest->setAmzHeader('x-amz-acl', $acl);
        $rest->setAmzHeader('x-amz-copy-source', sprintf('/%s/%s', $srcBucket, rawurlencode($srcUri)));
        if (sizeof($requestHeaders) > 0 || sizeof($metaHeaders) > 0)
            $rest->setAmzHeader('x-amz-metadata-directive', 'REPLACE');

        $rest = $rest->getResponse();
        if ($rest->error === false && $rest->code !== 200)
            $rest->error = array('code' => $rest->code, 'message' => 'Unexpected HTTP status');
        if ($rest->error !== false)
        {
            self::__triggerError(sprintf("S3::copyObject({$srcBucket}, {$srcUri}, {$bucket}, {$uri}): [%s] %s",
            $rest->error['code'], $rest->error['message']), __FILE__, __LINE__);
            return false;
        }
        return isset($rest->body->LastModified, $rest->body->ETag) ? array(
            'time' => strtotime((string)$rest->body->LastModified),
            'hash' => substr((string)$rest->body->ETag, 1, -1)
        ) : false;
    }

I am using it in my PHP code as follows:

$s3 = new S3(AWS_ACCESS_KEY, AWS_SECRET_KEY);
$s3->copyObject($srcBucket, $srcName, $bucketName, $saveName, S3::ACL_PUBLIC_READ_WRITE);
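One thing worth checking before anything else: the `copyObject()` method above reports failures through `trigger_error()` and returns `false`, so if error reporting or logging is disabled, a failed copy is silent. A minimal sketch of a call that makes the failure visible (assuming the same class and constants as above):

```php
// Make sure triggered errors actually reach the log before calling copyObject().
error_reporting(E_ALL);
ini_set('log_errors', '1');

$s3 = new S3(AWS_ACCESS_KEY, AWS_SECRET_KEY);
$result = $s3->copyObject($srcBucket, $srcName, $bucketName, $saveName, S3::ACL_PUBLIC_READ_WRITE);

if ($result === false) {
    // copyObject() returns false on failure; the triggered error holds the S3 error code.
    error_log('S3 copyObject failed - see the triggered error for details');
} else {
    error_log('Copied at ' . date('c', $result['time']) . ', ETag ' . $result['hash']);
}
```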

Nothing is showing up in my error log. What am I doing wrong, or what am I missing?

Upvotes: 0

Views: 3671

Answers (1)

user149341


At 1 TB, the object is too large to copy in a single operation. To quote from the S3 REST API documentation:

You can store individual objects of up to 5 TB in Amazon S3. You create a copy of your object up to 5 GB in size in a single atomic operation using this API. However, for copying an object greater than 5 GB, you must use the multipart upload API.

Unfortunately, it doesn't appear that the S3 class you're using supports multipart uploads, so you'll need to use something else. I'd strongly recommend that you use Amazon's AWS SDK for PHP — it's a bit bigger and more complex than the one you're using right now, but it supports the entirety of the S3 API (as well as other AWS services!), so it'll be able to handle this operation.
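With the official SDK, a minimal sketch might look like the following. This assumes AWS SDK for PHP v3 installed via Composer; its `S3Client::copy()` helper handles the multipart copy for objects above the single-operation limit, but check the SDK docs for your version, since option names and defaults vary.

```php
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\Exception\AwsException;

$client = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1', // substitute your buckets' region
]);

try {
    // copy() performs a multipart copy under the hood when the object
    // exceeds the 5 GB single-operation CopyObject limit.
    $client->copy($srcBucket, $srcName, $bucketName, $saveName, 'public-read-write');
} catch (AwsException $e) {
    error_log('S3 copy failed: ' . $e->getMessage());
}
```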

Upvotes: 1
