Reputation: 689
How can I execute a PHP file stored in an Amazon S3 bucket using the S3 SDK?
I tried $s3->getBucket and got the array below, but what I actually need is the result of executing that file.
Array
(
    [content/content.php] => Array
        (
            [name] => content/content.php
            [time] => 1353105202
            [size] => 1223
            [hash] => 355c51389b51814d4b6c822a6ec28cfe
        )
)
Is there any function/method to execute a PHP file?
Upvotes: 1
Views: 2363
Reputation: 389
Depending on the allow_url_fopen and allow_url_include ini settings of your PHP installation, this could be implemented with a custom stream wrapper. We use one in a project of ours; I'm not authorized to share the code because of IP restrictions, but it is based on http://docs.aws.amazon.com/aws-sdk-php/v2/guide/feature-s3-stream-wrapper.html
The Amazon S3 stream wrapper allows you to store and retrieve data from Amazon S3 using built-in PHP functions like file_get_contents, fopen, copy, rename, unlink, mkdir, rmdir, etc.
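As a minimal sketch of that idea with the AWS SDK for PHP v2 (the bucket name, object key, and credentials below are placeholders), you register the wrapper and can then read the object with built-in functions, or, ini settings permitting, include it directly:

require 'vendor/autoload.php';

use Aws\S3\S3Client;

$client = S3Client::factory(array(
    'key'    => 'YOUR_AWS_ACCESS_KEY',   // placeholder credentials
    'secret' => 'YOUR_AWS_SECRET_KEY',
));

// Registers the s3:// protocol so built-in filesystem functions work on S3.
$client->registerStreamWrapper();

// Read the script's source with a built-in function...
$code = file_get_contents('s3://my-bucket/content/content.php');

// ...or, if your ini settings permit including through the wrapper,
// execute it and capture whatever it prints.
ob_start();
include 's3://my-bucket/content/content.php';
$output = ob_get_clean();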
Upvotes: 0
Reputation: 71384
You could potentially use something like s3fs to mount the S3 bucket in your local filesystem and then just include the file as you would any other file.
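As a rough sketch, assuming the bucket has already been mounted with s3fs at a hypothetical mount point /mnt/s3, the object then behaves like any local file:

// /mnt/s3 is a hypothetical s3fs mount point for the bucket.
ob_start();
include '/mnt/s3/content/content.php';
$output = ob_get_clean();
echo $output; // whatever the included script printed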
Upvotes: 0
Reputation: 26527
You can't execute PHP files directly on Amazon S3.
What you would have to do is download the file's contents, write them to a temporary file, and then run that file (using include() or require()).
If the file produces output directly, you can wrap the include in ob_start() and ob_get_clean() to capture it.
// Download the object (the standalone S3 class exposes the body on $source->body;
// adjust this if your SDK returns the contents differently).
$source = $s3->getObject($bucket, $object);
// Write the downloaded code to a temporary file.
$filename = tempnam(sys_get_temp_dir(), 's3');
$file = fopen($filename, 'w');
fwrite($file, $source->body);
fclose($file);
// Execute the file and capture anything it prints.
ob_start();
require_once $filename;
$result = ob_get_clean();
echo $result; // the output of the executed PHP file
Upvotes: 3