Reputation: 879
I want to get the size of a folder without looping through all the files in Laravel. The folder is in Amazon S3. My current code is:
$size = 0;
$files = Storage::allFiles($dir);
foreach ($files as $file) {
    $size += Storage::size($file);
}
I want to avoid the looping. Is there any way to accomplish this?
Upvotes: 0
Views: 4746
Reputation: 643
Using listContents, you can get an array of files (including their sizes) in a single listing call, and then reduce that array to a total size:
$disk = Storage::disk('s3');

$size = array_sum(array_map(function ($file) {
    return (int) $file['size'];
}, array_filter($disk->listContents('your_folder', true /* <- recursive */), function ($file) {
    return $file['type'] === 'file';
})));
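On newer Laravel versions (9+, which ship Flysystem 3), listContents lives on the underlying driver and yields StorageAttributes objects rather than plain arrays. A rough equivalent sketch, assuming a disk named 's3' and the same hypothetical 'your_folder' prefix:

```php
// Flysystem 3.x (Laravel 9+): listContents() is on the driver and
// returns StorageAttributes objects instead of arrays.
$listing = Storage::disk('s3')
    ->getDriver()
    ->listContents('your_folder', true); // true = recursive

$size = 0;
foreach ($listing as $item) {
    if ($item->isFile()) {           // skip directory entries
        $size += $item->fileSize();  // size in bytes
    }
}
```

Either way, this is one listing operation against S3 instead of one size() call per file.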
Upvotes: 4
Reputation: 46849
The other option you have, if you can live with day-old stats, is the newly released 'S3 Storage Inventory' feature.
S3 can put out a daily (or weekly) file that has an inventory of all of your objects in the folder, including size:
http://docs.aws.amazon.com/AmazonS3/latest/dev/storage-inventory.html
Amazon S3 inventory is one of the tools Amazon S3 provides to help manage your storage. You can simplify and speed up business workflows and big data jobs using the Amazon S3 inventory, which provides a scheduled alternative to the Amazon S3 synchronous List API operation. Amazon S3 inventory provides a comma-separated values (CSV) flat-file output of your objects and their corresponding metadata on a daily or weekly basis for an S3 bucket or a shared prefix (that is, objects that have names that begin with a common string).
You can configure what object metadata to include in the inventory, whether to list all object versions or only current versions, where to store the inventory list flat-file output, and whether to generate the inventory on a daily or weekly basis. You can have multiple inventory lists configured for a bucket. For information about pricing, see Amazon S3 Pricing.
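Once an inventory report lands, totaling a folder becomes a local CSV pass rather than API calls. A minimal sketch in plain PHP, assuming the inventory was configured so the columns are Bucket, Key, Size (the order depends on the fields you selected) and that the file has already been decompressed (real inventory outputs are gzip-compressed and keys are URL-encoded):

```php
<?php
// Sum the Size column of an S3 inventory CSV report.
// Assumes columns: Bucket, Key, Size (order depends on the
// fields chosen when configuring the inventory).
function inventoryTotalSize(string $csvPath, string $prefix = ''): int
{
    $total = 0;
    $handle = fopen($csvPath, 'r');
    while (($row = fgetcsv($handle)) !== false) {
        [$bucket, $key, $size] = $row;
        // Restrict the total to one "folder" via its shared prefix.
        if ($prefix === '' || strpos($key, $prefix) === 0) {
            $total += (int) $size;
        }
    }
    fclose($handle);
    return $total;
}
```

The $prefix argument mirrors the shared-prefix scoping the docs describe, so you can total just one folder of the bucket.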
Upvotes: 1
Reputation: 269320
Amazon CloudWatch provides automatic metrics for the number of objects stored in a bucket and the storage space occupied (NumberOfObjects and BucketSizeBytes), reported once per day. That would be the simplest to use. However, these measure the whole bucket rather than just a particular folder.
See: Amazon Simple Storage Service Metrics and Dimensions
Upvotes: 0
Reputation: 5537
There is no way to compute the size of a folder without recursively looping through it.
A quick command-line solution is to use du:
du -hs /path/to/directory
will output the disk usage.
-h makes the numbers "human readable", e.g. 140M instead of 143260 (size in KBytes).
-s is for summary (otherwise you get not only the size of the folder itself but also of everything inside it separately).
Upvotes: 0