Elisabeth

Reputation: 21206

Setting CacheControl on Azure blockBlob, how to invalidate when new files are deployed?

I want to cache some static blobs we have on azure via:

CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
// Container names must be all lowercase
CloudBlobContainer container = blobClient.GetContainerReference("staticjsonfile");
CloudBlockBlob blockBlob = container.GetBlockBlobReference(blobref);

// Cache setup: total seconds until 2050-01-01
// (TotalSeconds, not Seconds, which is only the 0-59 component of the interval)
int expirationSeconds = (int)new DateTime(2050, 1, 1).Subtract(DateTime.Now).TotalSeconds;
string cacheControl = string.Format("public, max-age={0}, s-maxage={0}", expirationSeconds);
blockBlob.Properties.CacheControl = cacheControl;
blockBlob.SetProperties();

They are cached "forever".

When new files with the same name (same file names, new content) are uploaded to Azure blob storage, can I somehow invalidate the cached blockBlob?

Upvotes: 2

Views: 1189

Answers (1)

Adarsha

Reputation: 2377

Because you are also setting client-side cache control headers (public, max-age=...), even if you invalidate the server-side blob cache, the client is not going to check for new files; it will just use the cached version.

You should think about adding version numbers to your file names and updating your source references accordingly.
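For example, here's a minimal sketch using the same SDK as in the question (the file name, container name, and version scheme are just illustrative):

CloudBlobContainer container = blobClient.GetContainerReference("staticjsonfile");

// Bump the version per deployment; each release gets a brand-new URL,
// so the stale cached copy is simply never requested again.
string version = "v2";
string versionedName = string.Format("config.{0}.json", version); // e.g. config.v2.json

CloudBlockBlob blockBlob = container.GetBlockBlobReference(versionedName);
using (var fileStream = System.IO.File.OpenRead(@"config.json"))
{
    blockBlob.UploadFromStream(fileStream); // upload the new content under the new name
}

// Pages then reference the versioned URL:
// https://myaccount.blob.core.windows.net/staticjsonfile/config.v2.json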

Another way is to force the clients to issue an If-Modified-Since request (which requires adding must-revalidate to your Cache-Control header), so you have finer control over when a client gets the new file. But that won't help in your current case, as you can't do anything about the browsers/apps that have already cached your file.
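For illustration, with the same SDK as in the question, a revalidation-friendly header could look like this (the one-hour max-age is just an example value):

// Clients may reuse the copy for up to an hour, then must check back
// with the server before serving it again.
blockBlob.Properties.CacheControl = "public, max-age=3600, must-revalidate";
blockBlob.SetProperties();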

Forced cache revalidation also adds extra round trips (the server returns HTTP 304 when the file is not modified), which can hurt performance when network latency is high.
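As a rough sketch of what that round trip looks like from plain .NET (the URL and timestamp here are hypothetical):

// The client sends If-Modified-Since with the timestamp of its cached copy;
// the server answers 304 Not Modified (no body) if the blob hasn't changed.
DateTime lastFetchedUtc = new DateTime(2016, 1, 1, 0, 0, 0, DateTimeKind.Utc); // when the copy was cached (example)

var request = (System.Net.HttpWebRequest)System.Net.WebRequest.Create(
    "https://myaccount.blob.core.windows.net/staticjsonfile/config.json"); // hypothetical URL
request.IfModifiedSince = lastFetchedUtc;

try
{
    using (var response = (System.Net.HttpWebResponse)request.GetResponse())
    {
        // 200 OK: content changed; read the new body and refresh the cache.
    }
}
catch (System.Net.WebException ex)
{
    var notModified = ex.Response as System.Net.HttpWebResponse;
    if (notModified != null && notModified.StatusCode == System.Net.HttpStatusCode.NotModified)
    {
        // 304 Not Modified: the cached copy is still valid.
    }
}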

Upvotes: 2
