Pratap Vhatkar

Reputation: 701

Resize all images stored in Firebase Storage

I have uploaded around 10k high-resolution images to a Firebase Storage bucket. Due to high bandwidth usage, I now need to scale down those images in the bucket without deleting the originals.

The Firebase Resize Images extension solves this problem, but only for newly uploaded images. It will not work for images that have already been uploaded.

So, is there any way to do this? I am aware that we can use Cloud Functions to resize the images, but I am not sure how I would achieve this, since there is no trigger for the Cloud Function to fire.

Upvotes: 4

Views: 5371

Answers (6)

Richard

Reputation: 123

The Firebase Resize Images extension will do this automatically for you:

https://firebase.google.com/products/extensions/firebase-storage-resize-images

Upvotes: 0

rasta_boy

Reputation: 629

The Firebase Resize Images extension is triggered by the "google.storage.object.finalize" event.

As the GCP documentation says:

This event is sent when a new object is created (or an existing object is overwritten, and a new generation of that object is created) in the bucket.

To overwrite an object, we need to create a copy of it with the same name. To accomplish this, we can use the Node.js Admin SDK with the getFiles() method (thanks Renaud Tarnec) and then file.copy():

const { Storage } = require("@google-cloud/storage");
const storage = new Storage();

// Don't forget to replace with your bucket name
const bucket = storage.bucket("bucket-name.appspot.com");

const triggerBucketEvent = async () => {
  const [files] = await bucket.getFiles({
    prefix: "public", // you can add a path prefix
    autoPaginate: false,
  });
  // Copy each file onto itself (same directory, same name) to fire the finalize event
  await Promise.all(files.map((file) => file.copy(file.metadata.name)));
};

triggerBucketEvent();
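
Note that with autoPaginate: false only the first page of results is returned, so with roughly 10k images you will likely need to loop over the pages. A minimal sketch of that, reusing the bucket defined above (the page size of 1000 and the copyAllFiles name are my own assumptions):

const copyAllFiles = async () => {
  let query = { prefix: "public", autoPaginate: false, maxResults: 1000 };
  while (query) {
    const [files, nextQuery] = await bucket.getFiles(query);
    // Copy each file onto itself to fire the finalize event for this page
    await Promise.all(files.map((file) => file.copy(file.metadata.name)));
    query = nextQuery; // null once there are no more pages
  }
};

copyAllFiles();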

Upvotes: 0

Konstantin Tarkus

Reputation: 38428

Alternatively, you can create a Google Cloud Function (URL endpoint) that fetches images, resizing them at runtime and caching the result in both GCS and a CDN. For example:

https://i.kriasoft.com/sample.jpg - original image
https://i.kriasoft.com/w_200,h_100/sample.jpg - dynamically resized image

$ npm install image-resizing --save

const { createHandler } = require("image-resizing");

module.exports.img = createHandler({
  // Where the source images are located.
  // E.g. gs://s.example.com/image.jpg
  sourceBucket: "s.example.com",

  // Where the transformed images need to be stored.
  // E.g. gs://c.example.com/image__w_80,h_60.jpg
  cacheBucket: "c.example.com",
});

https://github.com/kriasoft/image-resizing

Upvotes: 1

bhr

Reputation: 2337

Here's a quick and dirty Node.js function that will trigger the Firebase Resize Images extension. It works only on jpg/jpeg files.

I am copying each file to the same destination. It's important to set the previous metadata again after copying, as it gets lost during the copy operation.

const { initializeApp } = require("firebase-admin/app");
const { getStorage } = require("firebase-admin/storage");

initializeApp();

const run = async () => {
  const storage = getStorage();
  const bucketName = `${process.env.PROJECT_ID}.appspot.com`; // replace this with the bucket you want to run on
  const bucket = storage.bucket(bucketName);
  const [files] = await bucket.getFiles();
  for (const file of files) {
    const isImage = file.name.toLowerCase().includes('jpg') || file.name.toLowerCase().includes('jpeg');
    const isThumb = file.name.toLowerCase().includes('thumbs');
    if (isImage && !isThumb) {
      console.info(`Copying ${file.name}`);
      const [metadata] = await file.getMetadata();
      await file.copy(file.name);
      await file.setMetadata(metadata);
    }
  }
};

run();

Upvotes: 2

Doug Stevenson

Reputation: 317778

If your content is already uploaded to a Cloud Storage bucket, there isn't any sort of Cloud Function or simple code to deploy that will do all this automatically. Cloud Functions only respond to events that happen within a bucket, such as the creation or deletion of objects.

What you'll end up having to do is write some code that lists all your objects, then, for each one, downloads it, modifies it locally, and uploads the modified version back to the bucket. This could take a while, depending on how much content you have. If you only need to do this once, you're best off writing a script and running it in Cloud Shell, so you minimize the time spent transferring the objects.
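
A minimal sketch of such a one-off script, assuming the sharp library for the resizing step and a separate resized/ prefix for the output (the bucket name, prefix, and target width are placeholders, not part of this answer):

const { Storage } = require("@google-cloud/storage");
const sharp = require("sharp"); // assumed resizing library; any image library would do

// Placeholder bucket name - replace with your own
const bucket = new Storage().bucket("your-project-id.appspot.com");

const resizeAll = async () => {
  const [files] = await bucket.getFiles();
  for (const file of files) {
    if (!/\.(jpe?g|png)$/i.test(file.name)) continue;  // skip non-image objects
    const [contents] = await file.download();           // download the original
    const resized = await sharp(contents)
      .resize({ width: 800, withoutEnlargement: true }) // example target size
      .toBuffer();
    // Upload the scaled-down copy under a separate prefix so the original is kept
    await bucket.file(`resized/${file.name}`).save(resized, {
      contentType: file.metadata.contentType,
    });
  }
};

resizeAll().catch(console.error);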

Upvotes: 2

Renaud Tarnec

Reputation: 83183

There is an official Cloud Function sample which shows how to resize images (it is actually quite similar to the Cloud Function that underlies the Resize Images extension).

This sample Cloud Function triggers on upload of a file to the Firebase project's default Cloud Storage bucket, but it should be easy to adapt its trigger.

You could, for example, write a callable Cloud Function that uses the Node.js Admin SDK getFiles() method to list all the files of your bucket and resize each file.

Since you have a lot of files to process, you should most probably work in batches, because there is a limit of 9 minutes for Cloud Function execution.
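
A rough skeleton of such a callable function, assuming firebase-functions v1 syntax; the resizeBatch name, the batchSize/pageToken parameters, and the resizeImage stand-in are illustrative, not part of the official sample:

const functions = require("firebase-functions");
const admin = require("firebase-admin");

admin.initializeApp();

// Stand-in for the actual resizing logic from the official sample
async function resizeImage(file) {
  functions.logger.info(`Would resize ${file.name} here`);
}

// Processes one page of files per call so each invocation stays under the
// 9-minute limit; the client keeps calling it with the returned page token
// until no token comes back.
exports.resizeBatch = functions
  .runWith({ timeoutSeconds: 540, memory: "1GB" })
  .https.onCall(async (data) => {
    const bucket = admin.storage().bucket();
    const [files, nextQuery] = await bucket.getFiles({
      maxResults: data.batchSize || 100, // hypothetical batch size parameter
      pageToken: data.pageToken,         // hypothetical page token parameter
      autoPaginate: false,
    });

    for (const file of files) {
      await resizeImage(file);
    }

    return { nextPageToken: nextQuery ? nextQuery.pageToken : null };
  });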

Upvotes: 5
