Reputation: 592
Is it possible to use gsutil in a Firebase Functions trigger? I would like to update variables containing the sizes of multiple folders each time a user adds or deletes a file in a storage bucket, i.e. run gsutil du -s gs://[BUCKET_NAME]/[FOLDER_NAME]
in a storage trigger.
Alternatively, is there another way to get the sizes of a bucket's folders besides gsutil?
Upvotes: 1
Views: 1016
Reputation: 1360
You can use the @google-cloud/storage NPM module, specifically Bucket.getFiles(), to iterate through the list of files and add up their sizes. (I believe this is what gsutil du
does under the hood.) Note that if more than 1,000 files are returned, you'll need to handle paging the results.
Running the following function over HTTP, you should get output like:
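If you turn off the client's automatic pagination (`autoPaginate: false`), a manual paging loop might look like the following sketch. The function name and parameters are placeholders of my choosing; `storage` is assumed to be an already-constructed @google-cloud/storage client, and with `autoPaginate: false` the promise resolves to `[files, nextQuery]`, where `nextQuery` is null after the last page:

```javascript
// Sketch: sum the sizes of all objects under a prefix, paging manually.
// `storage` is assumed to be a @google-cloud/storage client instance.
async function folderSize(storage, bucketName, prefix) {
  let totalSize = 0;
  let query = { prefix, autoPaginate: false, maxResults: 1000 };

  while (query) {
    // With autoPaginate disabled, getFiles resolves to [files, nextQuery];
    // nextQuery is null once the last page has been fetched.
    const [files, nextQuery] = await storage.bucket(bucketName).getFiles(query);
    // metadata.size is returned as a string, so parse it before summing
    totalSize += files.reduce((sum, f) => sum + parseInt(f.metadata.size, 10), 0);
    query = nextQuery;
  }
  return totalSize;
}
```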
3 files: 9.55MB / 10008960
index.js
const functions = require('firebase-functions');
// In current versions of @google-cloud/storage the client is constructed
// with `new Storage()` rather than by calling the module directly.
const { Storage } = require('@google-cloud/storage');
const bytes = require('bytes');

exports.bucketSizer = functions.https.onRequest((req, res) => {
  const storage = new Storage();
  const bucketName = 'MY_BUCKET_NAME';
  const dir = 'MY_DIRECTORY_NAME/';
  const options = {
    prefix: dir,
  };

  storage
    .bucket(bucketName)
    .getFiles(options)
    .then(results => {
      const files = results[0];
      let totalSize = 0;
      console.log('Files:');
      files.forEach(file => {
        console.log(file.name);
        // metadata.size is returned as a string, so parse it
        const fileSize = parseInt(file.metadata.size, 10);
        console.log(fileSize);
        totalSize += fileSize;
      });
      console.log('Total file size -->', totalSize);
      res.send(`${files.length} files: ${bytes(totalSize)} / ${totalSize}`);
      return true;
    })
    .catch(err => {
      console.error('ERROR:', err);
      res.status(500).send(err);
    });
});
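Since the question asked about a storage trigger specifically, the same calculation could be wired to one. Below is a sketch of a handler; the function names are placeholders I've chosen, `storage` is an injected @google-cloud/storage client, and I'm assuming files live under one-level "folders" (top-level prefixes):

```javascript
// Sketch: recompute a folder's total size whenever a file in it changes.
// `object` is the Storage event payload (has .name and .bucket);
// `storage` is a @google-cloud/storage client instance.
async function onObjectChanged(object, storage) {
  // Derive the top-level "folder" of the file that was added or deleted.
  const folder = object.name.split('/')[0] + '/';
  const [files] = await storage.bucket(object.bucket).getFiles({ prefix: folder });
  // metadata.size is a string, so parse before summing
  const totalSize = files.reduce((sum, f) => sum + parseInt(f.metadata.size, 10), 0);
  console.log(`${folder} now totals ${totalSize} bytes`);
  // Persist totalSize wherever your variables live (Firestore, RTDB, ...)
  return totalSize;
}
```

You would register it for both uploads and deletions, e.g. `exports.trackFolderSize = functions.storage.object().onFinalize(o => onObjectChanged(o, new Storage()));` plus a matching `onDelete` trigger, since a single trigger does not cover both events. Note this recomputes the whole folder on every change, which is simple but costs one list operation per invocation.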
package.json
Install the two dependencies; npm will record them in package.json:
$ npm install --save @google-cloud/storage
$ npm install --save bytes
Upvotes: 4