Roopesh Krishna

Reputation: 57

JSON file in a GCP bucket not updating quickly at its public URL when overwritten

I have a file in a GCP bucket. When I overwrite it with new content, the change is not reflected quickly at its public URL. If I open the JSON file in the console, I can see the content has changed, but the public URL still serves the stale content. I'm using Node.js; my code looks like this:

const {Storage} = require('@google-cloud/storage');
const {PassThrough} = require('stream');

const storage = new Storage();
const myBucket = storage.bucket(bucketName); // bucketName is defined elsewhere

const destFileName = 'data.json';
const file = myBucket.file(destFileName);

const fData = files; // `files` holds the JSON payload (defined elsewhere)
const passthroughStream = new PassThrough();
passthroughStream.write(fData);
passthroughStream.end();

async function streamFileUpload() {
  passthroughStream.pipe(file.createWriteStream({
    metadata: {cacheControl: 'no-store'},
  })).on('finish', () => {
    console.log('Stream file upload function is completed');
  });
  console.log(`${destFileName} uploaded to ${bucketName}`);
}

await streamFileUpload();

Is there any way to update the JSON file so that the public link reflects the change immediately?

Thank you!

Upvotes: 0

Views: 467

Answers (1)

Brandon Yarbrough

Reputation: 38389

Publicly readable objects default to being cacheable. If you want a public link to offer strong consistency, you'll need to explicitly disable caching for that object, for example by setting a cacheControl property like no-cache, max-age=0. If you're uploading with gsutil, you could do it like this:

gsutil -h "Cache-Control:no-cache, max-age=0" \
    cp -a public-read myfile.json gs://mybucket
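Since the question uses Node.js, here is a minimal sketch of the same metadata change using the @google-cloud/storage client's setMetadata call (the bucket and object names are placeholders, and the object is assumed to already exist):

```javascript
// Sketch: disable caching on an existing public object.
// NO_CACHE_METADATA mirrors the gsutil header above.
const NO_CACHE_METADATA = {cacheControl: 'no-cache, max-age=0'};

async function disableCaching(bucketName, fileName) {
  // Loaded lazily; assumes @google-cloud/storage is installed and
  // application credentials are configured.
  const {Storage} = require('@google-cloud/storage');
  const storage = new Storage();
  await storage.bucket(bucketName).file(fileName).setMetadata(NO_CACHE_METADATA);
}

// Example (placeholder names):
// disableCaching('mybucket', 'myfile.json').catch(console.error);
```

You can also pass the same `cacheControl` value in the `metadata` option of `createWriteStream` at upload time, as the question's code does with `no-store`.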

Alternatively, you could append a random query parameter to your URL to skip any caches between you and Cloud Storage, like dodgeCache=12345.
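A minimal sketch of that cache-busting idea (the function name and parameter are illustrative):

```javascript
// Append a throwaway query parameter so intermediate caches treat
// each request as a distinct URL.
function cacheBustedUrl(publicUrl) {
  const sep = publicUrl.includes('?') ? '&' : '?';
  return `${publicUrl}${sep}dodgeCache=${Date.now()}`;
}

// e.g. cacheBustedUrl('https://storage.googleapis.com/mybucket/data.json')
```

Note this only helps the client making the modified request; other consumers of the plain URL may still see cached content.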

Upvotes: 1
